Feb 16 12:11:57 localhost kernel: Linux version 5.14.0-677.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Feb 6 13:57:07 UTC 2026
Feb 16 12:11:57 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 16 12:11:57 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:57 localhost kernel: BIOS-provided physical RAM map:
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 16 12:11:57 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 16 12:11:57 localhost kernel: NX (Execute Disable) protection: active
Feb 16 12:11:57 localhost kernel: APIC: Static calls initialized
Feb 16 12:11:57 localhost kernel: SMBIOS 2.8 present.
Feb 16 12:11:57 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 16 12:11:57 localhost kernel: Hypervisor detected: KVM
Feb 16 12:11:57 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 16 12:11:57 localhost kernel: kvm-clock: using sched offset of 9726568530 cycles
Feb 16 12:11:57 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 16 12:11:57 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 16 12:11:57 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 16 12:11:57 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 16 12:11:57 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 16 12:11:57 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 16 12:11:57 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 16 12:11:57 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 16 12:11:57 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 16 12:11:57 localhost kernel: Using GB pages for direct mapping
Feb 16 12:11:57 localhost kernel: RAMDISK: [mem 0x1b6e4000-0x29b69fff]
Feb 16 12:11:57 localhost kernel: ACPI: Early table checksum verification disabled
Feb 16 12:11:57 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 16 12:11:57 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:57 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:57 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:57 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 16 12:11:57 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:57 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 12:11:57 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 16 12:11:57 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 16 12:11:57 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 16 12:11:57 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 16 12:11:57 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 16 12:11:57 localhost kernel: No NUMA configuration found
Feb 16 12:11:57 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 16 12:11:57 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Feb 16 12:11:57 localhost kernel: crashkernel reserved: 0x00000000a7000000 - 0x00000000b7000000 (256 MB)
Feb 16 12:11:57 localhost kernel: Zone ranges:
Feb 16 12:11:57 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 16 12:11:57 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 16 12:11:57 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 12:11:57 localhost kernel:   Device   empty
Feb 16 12:11:57 localhost kernel: Movable zone start for each node
Feb 16 12:11:57 localhost kernel: Early memory node ranges
Feb 16 12:11:57 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 16 12:11:57 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 16 12:11:57 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 12:11:57 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 16 12:11:57 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 16 12:11:57 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 16 12:11:57 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 16 12:11:57 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 16 12:11:57 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 16 12:11:57 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 16 12:11:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 16 12:11:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 16 12:11:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 16 12:11:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 16 12:11:57 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 16 12:11:57 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 16 12:11:57 localhost kernel: TSC deadline timer available
Feb 16 12:11:57 localhost kernel: CPU topo: Max. logical packages:   8
Feb 16 12:11:57 localhost kernel: CPU topo: Max. logical dies:       8
Feb 16 12:11:57 localhost kernel: CPU topo: Max. dies per package:   1
Feb 16 12:11:57 localhost kernel: CPU topo: Max. threads per core:   1
Feb 16 12:11:57 localhost kernel: CPU topo: Num. cores per package:     1
Feb 16 12:11:57 localhost kernel: CPU topo: Num. threads per package:   1
Feb 16 12:11:57 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 16 12:11:57 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 16 12:11:57 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 16 12:11:57 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 16 12:11:57 localhost kernel: Booting paravirtualized kernel on KVM
Feb 16 12:11:57 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 16 12:11:57 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 16 12:11:57 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 16 12:11:57 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 16 12:11:57 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 16 12:11:57 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 16 12:11:57 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:57 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64", will be passed to user space.
Feb 16 12:11:57 localhost kernel: random: crng init done
Feb 16 12:11:57 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 16 12:11:57 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 16 12:11:57 localhost kernel: Fallback order for Node 0: 0 
Feb 16 12:11:57 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 16 12:11:57 localhost kernel: Policy zone: Normal
Feb 16 12:11:57 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 16 12:11:57 localhost kernel: software IO TLB: area num 8.
Feb 16 12:11:57 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 16 12:11:57 localhost kernel: ftrace: allocating 49543 entries in 194 pages
Feb 16 12:11:57 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 16 12:11:57 localhost kernel: Dynamic Preempt: voluntary
Feb 16 12:11:57 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 16 12:11:57 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 16 12:11:57 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 16 12:11:57 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 16 12:11:57 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 16 12:11:57 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 16 12:11:57 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 16 12:11:57 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 16 12:11:57 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:57 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:57 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 12:11:57 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 16 12:11:57 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 16 12:11:57 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 16 12:11:57 localhost kernel: Console: colour VGA+ 80x25
Feb 16 12:11:57 localhost kernel: printk: console [ttyS0] enabled
Feb 16 12:11:57 localhost kernel: ACPI: Core revision 20230331
Feb 16 12:11:57 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 16 12:11:57 localhost kernel: x2apic enabled
Feb 16 12:11:57 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 16 12:11:57 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 16 12:11:57 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 16 12:11:57 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 16 12:11:57 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 16 12:11:57 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 16 12:11:57 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 16 12:11:57 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 16 12:11:57 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 16 12:11:57 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 16 12:11:57 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 16 12:11:57 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 16 12:11:57 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 16 12:11:57 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 16 12:11:57 localhost kernel: active return thunk: retbleed_return_thunk
Feb 16 12:11:57 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 16 12:11:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 16 12:11:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 16 12:11:57 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 16 12:11:57 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 16 12:11:57 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 16 12:11:57 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 16 12:11:57 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 16 12:11:57 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 16 12:11:57 localhost kernel: landlock: Up and running.
Feb 16 12:11:57 localhost kernel: Yama: becoming mindful.
Feb 16 12:11:57 localhost kernel: SELinux:  Initializing.
Feb 16 12:11:57 localhost kernel: LSM support for eBPF active
Feb 16 12:11:57 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 12:11:57 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 12:11:57 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 16 12:11:57 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 16 12:11:57 localhost kernel: ... version:                0
Feb 16 12:11:57 localhost kernel: ... bit width:              48
Feb 16 12:11:57 localhost kernel: ... generic registers:      6
Feb 16 12:11:57 localhost kernel: ... value mask:             0000ffffffffffff
Feb 16 12:11:57 localhost kernel: ... max period:             00007fffffffffff
Feb 16 12:11:57 localhost kernel: ... fixed-purpose events:   0
Feb 16 12:11:57 localhost kernel: ... event mask:             000000000000003f
Feb 16 12:11:57 localhost kernel: signal: max sigframe size: 1776
Feb 16 12:11:57 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 16 12:11:57 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 16 12:11:57 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 16 12:11:57 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 16 12:11:57 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 16 12:11:57 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 16 12:11:57 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 16 12:11:57 localhost kernel: node 0 deferred pages initialised in 10ms
Feb 16 12:11:57 localhost kernel: Memory: 7617500K/8388068K available (16384K kernel code, 5795K rwdata, 13944K rodata, 4204K init, 7180K bss, 764416K reserved, 0K cma-reserved)
Feb 16 12:11:57 localhost kernel: devtmpfs: initialized
Feb 16 12:11:57 localhost kernel: x86/mm: Memory block size: 128MB
Feb 16 12:11:57 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 16 12:11:57 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 16 12:11:57 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 16 12:11:57 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 16 12:11:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 16 12:11:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 16 12:11:57 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 16 12:11:57 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 16 12:11:57 localhost kernel: audit: type=2000 audit(1771243916.183:1): state=initialized audit_enabled=0 res=1
Feb 16 12:11:57 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 16 12:11:57 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 16 12:11:57 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 16 12:11:57 localhost kernel: cpuidle: using governor menu
Feb 16 12:11:57 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 16 12:11:57 localhost kernel: PCI: Using configuration type 1 for base access
Feb 16 12:11:57 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 16 12:11:57 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 16 12:11:57 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 16 12:11:57 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 16 12:11:57 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 16 12:11:57 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 16 12:11:57 localhost kernel: Demotion targets for Node 0: null
Feb 16 12:11:57 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 16 12:11:57 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 16 12:11:57 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 16 12:11:57 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 16 12:11:57 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 16 12:11:57 localhost kernel: ACPI: Interpreter enabled
Feb 16 12:11:57 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 16 12:11:57 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 16 12:11:57 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 16 12:11:57 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 16 12:11:57 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 16 12:11:57 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 16 12:11:57 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [3] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [4] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [5] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [6] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [7] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [8] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [9] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [10] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [11] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [12] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [13] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [14] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [15] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [16] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [17] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [18] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [19] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [20] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [21] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [22] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [23] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [24] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [25] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [26] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [27] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [28] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [29] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [30] registered
Feb 16 12:11:57 localhost kernel: acpiphp: Slot [31] registered
Feb 16 12:11:57 localhost kernel: PCI host bridge to bus 0000:00
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 16 12:11:57 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 16 12:11:57 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 16 12:11:57 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 12:11:57 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 16 12:11:57 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 16 12:11:57 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 16 12:11:57 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 16 12:11:57 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 16 12:11:57 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 16 12:11:57 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 16 12:11:57 localhost kernel: iommu: Default domain type: Translated
Feb 16 12:11:57 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 16 12:11:57 localhost kernel: SCSI subsystem initialized
Feb 16 12:11:57 localhost kernel: ACPI: bus type USB registered
Feb 16 12:11:57 localhost kernel: usbcore: registered new interface driver usbfs
Feb 16 12:11:57 localhost kernel: usbcore: registered new interface driver hub
Feb 16 12:11:57 localhost kernel: usbcore: registered new device driver usb
Feb 16 12:11:57 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 16 12:11:57 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 16 12:11:57 localhost kernel: PTP clock support registered
Feb 16 12:11:57 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 16 12:11:57 localhost kernel: NetLabel: Initializing
Feb 16 12:11:57 localhost kernel: NetLabel:  domain hash size = 128
Feb 16 12:11:57 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 16 12:11:57 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 16 12:11:57 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 16 12:11:57 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 16 12:11:57 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 16 12:11:57 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 16 12:11:57 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 16 12:11:57 localhost kernel: vgaarb: loaded
Feb 16 12:11:57 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 16 12:11:57 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 16 12:11:57 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 16 12:11:57 localhost kernel: pnp: PnP ACPI init
Feb 16 12:11:57 localhost kernel: pnp 00:03: [dma 2]
Feb 16 12:11:57 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 16 12:11:57 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 16 12:11:57 localhost kernel: NET: Registered PF_INET protocol family
Feb 16 12:11:57 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 16 12:11:57 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 16 12:11:57 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 16 12:11:57 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 16 12:11:57 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 16 12:11:57 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 16 12:11:57 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 16 12:11:57 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 12:11:57 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 12:11:57 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 16 12:11:57 localhost kernel: NET: Registered PF_XDP protocol family
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 16 12:11:57 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 16 12:11:57 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 16 12:11:57 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 16 12:11:57 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 22354 usecs
Feb 16 12:11:57 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 16 12:11:57 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 16 12:11:57 localhost kernel: software IO TLB: mapped [mem 0x00000000bbfdb000-0x00000000bffdb000] (64MB)
Feb 16 12:11:57 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 16 12:11:57 localhost kernel: ACPI: bus type thunderbolt registered
Feb 16 12:11:57 localhost kernel: Initialise system trusted keyrings
Feb 16 12:11:57 localhost kernel: Key type blacklist registered
Feb 16 12:11:57 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 16 12:11:57 localhost kernel: zbud: loaded
Feb 16 12:11:57 localhost kernel: integrity: Platform Keyring initialized
Feb 16 12:11:57 localhost kernel: integrity: Machine keyring initialized
Feb 16 12:11:57 localhost kernel: Freeing initrd memory: 234008K
Feb 16 12:11:57 localhost kernel: NET: Registered PF_ALG protocol family
Feb 16 12:11:57 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 16 12:11:57 localhost kernel: Key type asymmetric registered
Feb 16 12:11:57 localhost kernel: Asymmetric key parser 'x509' registered
Feb 16 12:11:57 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 16 12:11:57 localhost kernel: io scheduler mq-deadline registered
Feb 16 12:11:57 localhost kernel: io scheduler kyber registered
Feb 16 12:11:57 localhost kernel: io scheduler bfq registered
Feb 16 12:11:57 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 16 12:11:57 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 16 12:11:57 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 16 12:11:57 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 16 12:11:57 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 16 12:11:57 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 16 12:11:57 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 16 12:11:57 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 16 12:11:57 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 16 12:11:57 localhost kernel: Non-volatile memory driver v1.3
Feb 16 12:11:57 localhost kernel: rdac: device handler registered
Feb 16 12:11:57 localhost kernel: hp_sw: device handler registered
Feb 16 12:11:57 localhost kernel: emc: device handler registered
Feb 16 12:11:57 localhost kernel: alua: device handler registered
Feb 16 12:11:57 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 16 12:11:57 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 16 12:11:57 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 16 12:11:57 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 16 12:11:57 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 16 12:11:57 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 16 12:11:57 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 16 12:11:57 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-677.el9.x86_64 uhci_hcd
Feb 16 12:11:57 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 16 12:11:57 localhost kernel: hub 1-0:1.0: USB hub found
Feb 16 12:11:57 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 16 12:11:57 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 16 12:11:57 localhost kernel: usbserial: USB Serial support registered for generic
Feb 16 12:11:57 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 16 12:11:57 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 16 12:11:57 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 16 12:11:57 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 16 12:11:57 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 16 12:11:57 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 16 12:11:57 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-16T12:11:56 UTC (1771243916)
Feb 16 12:11:57 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 16 12:11:57 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 16 12:11:57 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 16 12:11:57 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 16 12:11:57 localhost kernel: usbcore: registered new interface driver usbhid
Feb 16 12:11:57 localhost kernel: usbhid: USB HID core driver
Feb 16 12:11:57 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 16 12:11:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 16 12:11:57 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 16 12:11:57 localhost kernel: Initializing XFRM netlink socket
Feb 16 12:11:57 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 16 12:11:57 localhost kernel: Segment Routing with IPv6
Feb 16 12:11:57 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 16 12:11:57 localhost kernel: mpls_gso: MPLS GSO support
Feb 16 12:11:57 localhost kernel: IPI shorthand broadcast: enabled
Feb 16 12:11:57 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 16 12:11:57 localhost kernel: AES CTR mode by8 optimization enabled
Feb 16 12:11:57 localhost kernel: sched_clock: Marking stable (1015002659, 159585320)->(1272101639, -97513660)
Feb 16 12:11:57 localhost kernel: registered taskstats version 1
Feb 16 12:11:57 localhost kernel: Loading compiled-in X.509 certificates
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 16 12:11:57 localhost kernel: Demotion targets for Node 0: null
Feb 16 12:11:57 localhost kernel: page_owner is disabled
Feb 16 12:11:57 localhost kernel: Key type .fscrypt registered
Feb 16 12:11:57 localhost kernel: Key type fscrypt-provisioning registered
Feb 16 12:11:57 localhost kernel: Key type big_key registered
Feb 16 12:11:57 localhost kernel: Key type encrypted registered
Feb 16 12:11:57 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 16 12:11:57 localhost kernel: Loading compiled-in module X.509 certificates
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 12:11:57 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 16 12:11:57 localhost kernel: ima: No architecture policies found
Feb 16 12:11:57 localhost kernel: evm: Initialising EVM extended attributes:
Feb 16 12:11:57 localhost kernel: evm: security.selinux
Feb 16 12:11:57 localhost kernel: evm: security.SMACK64 (disabled)
Feb 16 12:11:57 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 16 12:11:57 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 16 12:11:57 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 16 12:11:57 localhost kernel: evm: security.apparmor (disabled)
Feb 16 12:11:57 localhost kernel: evm: security.ima
Feb 16 12:11:57 localhost kernel: evm: security.capability
Feb 16 12:11:57 localhost kernel: evm: HMAC attrs: 0x1
Feb 16 12:11:57 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 16 12:11:57 localhost kernel: Running certificate verification RSA selftest
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 16 12:11:57 localhost kernel: Running certificate verification ECDSA selftest
Feb 16 12:11:57 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 16 12:11:57 localhost kernel: clk: Disabling unused clocks
Feb 16 12:11:57 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 16 12:11:57 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 16 12:11:57 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 16 12:11:57 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 392K
Feb 16 12:11:57 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 16 12:11:57 localhost kernel: Run /init as init process
Feb 16 12:11:57 localhost kernel:   with arguments:
Feb 16 12:11:57 localhost kernel:     /init
Feb 16 12:11:57 localhost kernel:   with environment:
Feb 16 12:11:57 localhost kernel:     HOME=/
Feb 16 12:11:57 localhost kernel:     TERM=linux
Feb 16 12:11:57 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64
Feb 16 12:11:57 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 12:11:57 localhost systemd[1]: Detected virtualization kvm.
Feb 16 12:11:57 localhost systemd[1]: Detected architecture x86-64.
Feb 16 12:11:57 localhost systemd[1]: Running in initrd.
Feb 16 12:11:57 localhost systemd[1]: No hostname configured, using default hostname.
Feb 16 12:11:57 localhost systemd[1]: Hostname set to <localhost>.
Feb 16 12:11:57 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 16 12:11:57 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 16 12:11:57 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 16 12:11:57 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 16 12:11:57 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 16 12:11:57 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 16 12:11:57 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 16 12:11:57 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 16 12:11:57 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 16 12:11:57 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 12:11:57 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 12:11:57 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 16 12:11:57 localhost systemd[1]: Reached target Local File Systems.
Feb 16 12:11:57 localhost systemd[1]: Reached target Path Units.
Feb 16 12:11:57 localhost systemd[1]: Reached target Slice Units.
Feb 16 12:11:57 localhost systemd[1]: Reached target Swaps.
Feb 16 12:11:57 localhost systemd[1]: Reached target Timer Units.
Feb 16 12:11:57 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 12:11:57 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 16 12:11:57 localhost systemd[1]: Listening on Journal Socket.
Feb 16 12:11:57 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 12:11:57 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 12:11:57 localhost systemd[1]: Reached target Socket Units.
Feb 16 12:11:57 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 12:11:57 localhost systemd[1]: Starting Journal Service...
Feb 16 12:11:57 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 12:11:57 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:11:57 localhost systemd[1]: Starting Create System Users...
Feb 16 12:11:57 localhost systemd[1]: Starting Setup Virtual Console...
Feb 16 12:11:57 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 12:11:57 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:11:57 localhost systemd-journald[308]: Journal started
Feb 16 12:11:57 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/66cb33d6ab364d79b328a95e41812fbd) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:11:57 localhost systemd[1]: Started Journal Service.
Feb 16 12:11:57 localhost systemd-sysusers[313]: Creating group 'users' with GID 100.
Feb 16 12:11:57 localhost systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Feb 16 12:11:57 localhost systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 16 12:11:57 localhost systemd[1]: Finished Create System Users.
Feb 16 12:11:57 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 12:11:57 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 12:11:57 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 12:11:57 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 12:11:57 localhost systemd[1]: Finished Setup Virtual Console.
Feb 16 12:11:57 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 16 12:11:57 localhost systemd[1]: Starting dracut cmdline hook...
Feb 16 12:11:57 localhost dracut-cmdline[328]: dracut-9 dracut-057-110.git20260130.el9
Feb 16 12:11:57 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 12:11:57 localhost systemd[1]: Finished dracut cmdline hook.
Feb 16 12:11:57 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 16 12:11:57 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 16 12:11:57 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 16 12:11:57 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 16 12:11:57 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 16 12:11:57 localhost kernel: RPC: Registered udp transport module.
Feb 16 12:11:57 localhost kernel: RPC: Registered tcp transport module.
Feb 16 12:11:57 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 16 12:11:57 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 16 12:11:57 localhost rpc.statd[446]: Version 2.5.4 starting
Feb 16 12:11:57 localhost rpc.statd[446]: Initializing NSM state
Feb 16 12:11:57 localhost rpc.idmapd[451]: Setting log level to 0
Feb 16 12:11:57 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 16 12:11:57 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 12:11:57 localhost systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 12:11:57 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 12:11:57 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 16 12:11:57 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 16 12:11:57 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 12:11:57 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 16 12:11:57 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:11:57 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 12:11:57 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:11:57 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:11:57 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 12:11:57 localhost systemd[1]: Reached target Network.
Feb 16 12:11:57 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 12:11:57 localhost systemd[1]: Starting dracut initqueue hook...
Feb 16 12:11:57 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 16 12:11:57 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 16 12:11:57 localhost kernel:  vda: vda1
Feb 16 12:11:57 localhost kernel: libata version 3.00 loaded.
Feb 16 12:11:57 localhost systemd-udevd[477]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:11:57 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 16 12:11:57 localhost kernel: scsi host0: ata_piix
Feb 16 12:11:57 localhost kernel: scsi host1: ata_piix
Feb 16 12:11:57 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 16 12:11:57 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 16 12:11:57 localhost kernel: ACPI: bus type drm_connector registered
Feb 16 12:11:57 localhost systemd[1]: Found device /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 12:11:57 localhost systemd[1]: Reached target Initrd Root Device.
Feb 16 12:11:57 localhost kernel: ata1: found unknown device (class 0)
Feb 16 12:11:57 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 16 12:11:57 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 16 12:11:57 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 16 12:11:57 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 16 12:11:57 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 16 12:11:57 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 16 12:11:57 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 16 12:11:57 localhost kernel: Console: switching to colour dummy device 80x25
Feb 16 12:11:57 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 16 12:11:57 localhost kernel: [drm] features: -context_init
Feb 16 12:11:57 localhost kernel: [drm] number of scanouts: 1
Feb 16 12:11:57 localhost kernel: [drm] number of cap sets: 0
Feb 16 12:11:57 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 16 12:11:57 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 16 12:11:57 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 16 12:11:57 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 16 12:11:57 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 16 12:11:58 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 16 12:11:58 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 16 12:11:58 localhost systemd[1]: Reached target System Initialization.
Feb 16 12:11:58 localhost systemd[1]: Reached target Basic System.
Feb 16 12:11:58 localhost systemd[1]: Finished dracut initqueue hook.
Feb 16 12:11:58 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 12:11:58 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 16 12:11:58 localhost systemd[1]: Reached target Remote File Systems.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 16 12:11:58 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 16 12:11:58 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f...
Feb 16 12:11:58 localhost systemd-fsck[567]: /usr/sbin/fsck.xfs: XFS file system.
Feb 16 12:11:58 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 12:11:58 localhost systemd[1]: Mounting /sysroot...
Feb 16 12:11:58 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 16 12:11:58 localhost kernel: XFS (vda1): Mounting V5 Filesystem 19ee07ed-c14b-4aa3-804d-f2cbdae2694f
Feb 16 12:11:58 localhost kernel: XFS (vda1): Ending clean mount
Feb 16 12:11:58 localhost systemd[1]: Mounted /sysroot.
Feb 16 12:11:58 localhost systemd[1]: Reached target Initrd Root File System.
Feb 16 12:11:58 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 16 12:11:58 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 16 12:11:58 localhost systemd[1]: Reached target Initrd File Systems.
Feb 16 12:11:58 localhost systemd[1]: Reached target Initrd Default Target.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut mount hook...
Feb 16 12:11:58 localhost systemd[1]: Finished dracut mount hook.
Feb 16 12:11:58 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 16 12:11:58 localhost rpc.idmapd[451]: exiting on signal 15
Feb 16 12:11:58 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 16 12:11:58 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 16 12:11:58 localhost systemd[1]: Stopped target Network.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Timer Units.
Feb 16 12:11:58 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 16 12:11:58 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Basic System.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Path Units.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Remote File Systems.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Slice Units.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Socket Units.
Feb 16 12:11:58 localhost systemd[1]: Stopped target System Initialization.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Local File Systems.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Swaps.
Feb 16 12:11:58 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut mount hook.
Feb 16 12:11:58 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 16 12:11:58 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 16 12:11:58 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 16 12:11:58 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 16 12:11:58 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 16 12:11:58 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 16 12:11:58 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 16 12:11:58 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 16 12:11:58 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 16 12:11:58 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 16 12:11:58 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 16 12:11:58 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Closed udev Control Socket.
Feb 16 12:11:58 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Closed udev Kernel Socket.
Feb 16 12:11:58 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 16 12:11:58 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 16 12:11:58 localhost systemd[1]: Starting Cleanup udev Database...
Feb 16 12:11:58 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 16 12:11:58 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 16 12:11:58 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Stopped Create System Users.
Feb 16 12:11:58 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 16 12:11:58 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 16 12:11:58 localhost systemd[1]: Finished Cleanup udev Database.
Feb 16 12:11:58 localhost systemd[1]: Reached target Switch Root.
Feb 16 12:11:58 localhost systemd[1]: Starting Switch Root...
Feb 16 12:11:59 localhost systemd[1]: Switching root.
Feb 16 12:11:59 localhost systemd-journald[308]: Journal stopped
Feb 16 12:11:59 localhost systemd-journald[308]: Received SIGTERM from PID 1 (n/a).
Feb 16 12:11:59 localhost kernel: audit: type=1404 audit(1771243919.196:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability open_perms=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:11:59 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:11:59 localhost kernel: audit: type=1403 audit(1771243919.325:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 16 12:11:59 localhost systemd[1]: Successfully loaded SELinux policy in 131.996ms.
Feb 16 12:11:59 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.062ms.
Feb 16 12:11:59 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 12:11:59 localhost systemd[1]: Detected virtualization kvm.
Feb 16 12:11:59 localhost systemd[1]: Detected architecture x86-64.
Feb 16 12:11:59 localhost systemd-rc-local-generator[649]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:11:59 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 16 12:11:59 localhost systemd[1]: Stopped Switch Root.
Feb 16 12:11:59 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 16 12:11:59 localhost systemd[1]: Created slice Slice /system/getty.
Feb 16 12:11:59 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 16 12:11:59 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 16 12:11:59 localhost systemd[1]: Created slice User and Session Slice.
Feb 16 12:11:59 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 12:11:59 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 16 12:11:59 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 16 12:11:59 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 12:11:59 localhost systemd[1]: Stopped target Switch Root.
Feb 16 12:11:59 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 16 12:11:59 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 16 12:11:59 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 16 12:11:59 localhost systemd[1]: Reached target Path Units.
Feb 16 12:11:59 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 16 12:11:59 localhost systemd[1]: Reached target Slice Units.
Feb 16 12:11:59 localhost systemd[1]: Reached target Swaps.
Feb 16 12:11:59 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 16 12:11:59 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 16 12:11:59 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 16 12:11:59 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 16 12:11:59 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 16 12:11:59 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 12:11:59 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 12:11:59 localhost systemd[1]: Mounting Huge Pages File System...
Feb 16 12:11:59 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 16 12:11:59 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 16 12:11:59 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 16 12:11:59 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 12:11:59 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 12:11:59 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:11:59 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 16 12:11:59 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 16 12:11:59 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 16 12:11:59 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 16 12:11:59 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 16 12:11:59 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 16 12:11:59 localhost systemd[1]: Stopped Journal Service.
Feb 16 12:11:59 localhost systemd[1]: Starting Journal Service...
Feb 16 12:11:59 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 12:11:59 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 16 12:11:59 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:11:59 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 16 12:11:59 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 16 12:11:59 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:11:59 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 12:11:59 localhost systemd-journald[697]: Journal started
Feb 16 12:11:59 localhost systemd-journald[697]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:11:59 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 16 12:11:59 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 16 12:11:59 localhost kernel: fuse: init (API version 7.37)
Feb 16 12:11:59 localhost systemd[1]: Started Journal Service.
Feb 16 12:11:59 localhost systemd[1]: Mounted Huge Pages File System.
Feb 16 12:11:59 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 16 12:11:59 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 16 12:11:59 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 16 12:11:59 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 12:11:59 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:11:59 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:11:59 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 16 12:12:00 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 16 12:12:00 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 16 12:12:00 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 16 12:12:00 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 16 12:12:00 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 16 12:12:00 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 16 12:12:00 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:12:00 localhost systemd[1]: Mounting FUSE Control File System...
Feb 16 12:12:00 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 12:12:00 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 16 12:12:00 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 16 12:12:00 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 16 12:12:00 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 16 12:12:00 localhost systemd[1]: Starting Create System Users...
Feb 16 12:12:00 localhost systemd-journald[697]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 12:12:00 localhost systemd[1]: Mounted FUSE Control File System.
Feb 16 12:12:00 localhost systemd-journald[697]: Received client request to flush runtime journal.
Feb 16 12:12:00 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 16 12:12:00 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 12:12:00 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 16 12:12:00 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 12:12:00 localhost systemd[1]: Finished Create System Users.
Feb 16 12:12:00 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 12:12:00 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 12:12:00 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 16 12:12:00 localhost systemd[1]: Reached target Local File Systems.
Feb 16 12:12:00 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 16 12:12:00 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 16 12:12:00 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 16 12:12:00 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 16 12:12:00 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 16 12:12:00 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 16 12:12:00 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 12:12:00 localhost bootctl[714]: Couldn't find EFI system partition, skipping.
Feb 16 12:12:00 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 16 12:12:00 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 12:12:00 localhost systemd[1]: Starting Security Auditing Service...
Feb 16 12:12:00 localhost systemd[1]: Starting RPC Bind...
Feb 16 12:12:00 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 16 12:12:00 localhost auditd[720]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 16 12:12:00 localhost auditd[720]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 16 12:12:00 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 16 12:12:00 localhost systemd[1]: Started RPC Bind.
Feb 16 12:12:00 localhost augenrules[725]: /sbin/augenrules: No change
Feb 16 12:12:00 localhost augenrules[740]: No rules
Feb 16 12:12:00 localhost augenrules[740]: enabled 1
Feb 16 12:12:00 localhost augenrules[740]: failure 1
Feb 16 12:12:00 localhost augenrules[740]: pid 720
Feb 16 12:12:00 localhost augenrules[740]: rate_limit 0
Feb 16 12:12:00 localhost augenrules[740]: backlog_limit 8192
Feb 16 12:12:00 localhost augenrules[740]: lost 0
Feb 16 12:12:00 localhost augenrules[740]: backlog 0
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time 60000
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 16 12:12:00 localhost augenrules[740]: enabled 1
Feb 16 12:12:00 localhost augenrules[740]: failure 1
Feb 16 12:12:00 localhost augenrules[740]: pid 720
Feb 16 12:12:00 localhost augenrules[740]: rate_limit 0
Feb 16 12:12:00 localhost augenrules[740]: backlog_limit 8192
Feb 16 12:12:00 localhost augenrules[740]: lost 0
Feb 16 12:12:00 localhost augenrules[740]: backlog 4
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time 60000
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 16 12:12:00 localhost augenrules[740]: enabled 1
Feb 16 12:12:00 localhost augenrules[740]: failure 1
Feb 16 12:12:00 localhost augenrules[740]: pid 720
Feb 16 12:12:00 localhost augenrules[740]: rate_limit 0
Feb 16 12:12:00 localhost augenrules[740]: backlog_limit 8192
Feb 16 12:12:00 localhost augenrules[740]: lost 0
Feb 16 12:12:00 localhost augenrules[740]: backlog 4
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time 60000
Feb 16 12:12:00 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 16 12:12:00 localhost systemd[1]: Started Security Auditing Service.
Feb 16 12:12:00 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 16 12:12:00 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 16 12:12:00 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 16 12:12:00 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 12:12:00 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 16 12:12:00 localhost systemd[1]: Starting Update is Completed...
Feb 16 12:12:00 localhost systemd-udevd[748]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 12:12:00 localhost systemd[1]: Finished Update is Completed.
Feb 16 12:12:00 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 12:12:00 localhost systemd[1]: Reached target System Initialization.
Feb 16 12:12:00 localhost systemd[1]: Started dnf makecache --timer.
Feb 16 12:12:00 localhost systemd[1]: Started Daily rotation of log files.
Feb 16 12:12:00 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 16 12:12:00 localhost systemd[1]: Reached target Timer Units.
Feb 16 12:12:00 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 12:12:00 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 16 12:12:00 localhost systemd[1]: Reached target Socket Units.
Feb 16 12:12:00 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 16 12:12:00 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:12:00 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 16 12:12:00 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 12:12:00 localhost systemd-udevd[754]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:12:00 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 12:12:00 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 12:12:00 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 16 12:12:00 localhost systemd[1]: Reached target Basic System.
Feb 16 12:12:00 localhost dbus-broker-lau[784]: Ready
Feb 16 12:12:00 localhost systemd[1]: Starting NTP client/server...
Feb 16 12:12:00 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 16 12:12:00 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 16 12:12:00 localhost chronyd[801]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 12:12:00 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 16 12:12:00 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 16 12:12:00 localhost chronyd[801]: Loaded 0 symmetric keys
Feb 16 12:12:00 localhost chronyd[801]: Using right/UTC timezone to obtain leap second data
Feb 16 12:12:00 localhost chronyd[801]: Loaded seccomp filter (level 2)
Feb 16 12:12:00 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 16 12:12:00 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 16 12:12:00 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 16 12:12:00 localhost systemd[1]: Started irqbalance daemon.
Feb 16 12:12:00 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 16 12:12:00 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:00 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:00 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 12:12:00 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 16 12:12:00 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 16 12:12:00 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 16 12:12:00 localhost kernel: kvm_amd: TSC scaling supported
Feb 16 12:12:00 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 16 12:12:00 localhost kernel: kvm_amd: Nested Paging enabled
Feb 16 12:12:00 localhost kernel: kvm_amd: LBR virtualization supported
Feb 16 12:12:00 localhost systemd[1]: Starting User Login Management...
Feb 16 12:12:00 localhost systemd[1]: Started NTP client/server.
Feb 16 12:12:00 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 16 12:12:00 localhost systemd-logind[821]: New seat seat0.
Feb 16 12:12:00 localhost systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 12:12:00 localhost systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 12:12:00 localhost systemd[1]: Started User Login Management.
Feb 16 12:12:00 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 16 12:12:01 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 16 12:12:01 localhost iptables.init[813]: iptables: Applying firewall rules: [  OK  ]
Feb 16 12:12:01 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 16 12:12:01 localhost cloud-init[851]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 16 Feb 2026 12:12:01 +0000. Up 6.05 seconds.
Feb 16 12:12:01 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 16 12:12:01 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 16 12:12:01 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp55nr_7kt.mount: Deactivated successfully.
Feb 16 12:12:01 localhost systemd[1]: Starting Hostname Service...
Feb 16 12:12:01 localhost systemd[1]: Started Hostname Service.
Feb 16 12:12:01 np0005620857.novalocal systemd-hostnamed[865]: Hostname set to <np0005620857.novalocal> (static)
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Reached target Preparation for Network.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Starting Network Manager...
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.1839] NetworkManager (version 1.54.3-2.el9) is starting... (boot:cd836bab-140a-4a06-bcbf-b453ec38ea52)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.1845] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2035] manager[0x55ac2f524000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2092] hostname: hostname: using hostnamed
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2092] hostname: static hostname changed from (none) to "np0005620857.novalocal"
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2098] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2216] manager[0x55ac2f524000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2217] manager[0x55ac2f524000]: rfkill: WWAN hardware radio set enabled
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2320] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2321] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2322] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2322] manager: Networking is enabled by state file
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2325] settings: Loaded settings plugin: keyfile (internal)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2373] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2398] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2414] dhcp: init: Using DHCP client 'internal'
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2419] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2435] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2448] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2461] device (lo): Activation: starting connection 'lo' (1e10248c-d525-48d3-b66b-d34bc8862c9f)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2470] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2473] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2503] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2508] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2510] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2512] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2514] device (eth0): carrier: link connected
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2517] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2524] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Started Network Manager.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2530] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2535] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2536] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2537] manager: NetworkManager state is now CONNECTING
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2538] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2544] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2547] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Reached target Network.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2581] dhcp4 (eth0): state changed new lease, address=38.102.83.251
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2587] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2604] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2685] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2688] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2689] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2693] device (lo): Activation: successful, device activated.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2699] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2702] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2704] device (eth0): Activation: successful, device activated.
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2709] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 12:12:02 np0005620857.novalocal NetworkManager[869]: <info>  [1771243922.2712] manager: startup complete
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Reached target NFS client services.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Reached target Remote File Systems.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 12:12:02 np0005620857.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 16 Feb 2026 12:12:02 +0000. Up 7.05 seconds.
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |  eth0  | True |        38.102.83.251         | 255.255.255.0 | global | fa:16:3e:69:15:88 |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |  eth0  | True | fe80::f816:3eff:fe69:1588/64 |       .       |  link  | fa:16:3e:69:15:88 |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 16 12:12:02 np0005620857.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: new group: name=cloud-user, GID=1001
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: add 'cloud-user' to group 'adm'
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: add 'cloud-user' to group 'systemd-journal'
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: add 'cloud-user' to shadow group 'adm'
Feb 16 12:12:03 np0005620857.novalocal useradd[998]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Generating public/private rsa key pair.
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key fingerprint is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: SHA256:b4KJbfJ6tDDOGli9ZXYKjH17c/KOJtSJ7ttr94x5BFI root@np0005620857.novalocal
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key's randomart image is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +---[RSA 3072]----+
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |                 |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |          E      |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |         .       |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |   =    . .      |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |  o = =oSo .     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: | o  oX+*o.  .    |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |. .o+*B.= +.     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |   .o+=ooO.+.    |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |  ...++=++=oo    |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Generating public/private ecdsa key pair.
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key fingerprint is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: SHA256:8FNtryl+faUogb797dxmo244b+NbrekP0jK8cwfSJlY root@np0005620857.novalocal
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key's randomart image is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +---[ECDSA 256]---+
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |                 |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |           .     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |      .   . o    |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |       o . . .E  |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |        S.   o.  |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |        ....+++ o|
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |       .  .oO*o+o|
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |        .o.++@=*=|
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |        ..o+X@@*+|
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Generating public/private ed25519 key pair.
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key fingerprint is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: SHA256:ZtwDcdKp/4mae2J7C2luJ9BppjV5BvIeaBBFSKkb8Lg root@np0005620857.novalocal
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: The key's randomart image is:
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +--[ED25519 256]--+
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |   ..=o o...     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |.   +    +o      |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: | + . .  ..       |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |. + . ..oo       |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: | . o . =S=o      |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |E .   +o@.+.     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |     . B+= o .   |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |      .o*o= o    |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: |       o*X..     |
Feb 16 12:12:03 np0005620857.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Reached target Network is Online.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting System Logging Service...
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Starting Permit User Sessions...
Feb 16 12:12:03 np0005620857.novalocal sm-notify[1015]: Version 2.5.4 starting
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 16 12:12:03 np0005620857.novalocal sshd[1017]: Server listening on 0.0.0.0 port 22.
Feb 16 12:12:03 np0005620857.novalocal sshd[1017]: Server listening on :: port 22.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Finished Permit User Sessions.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started Command Scheduler.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started Getty on tty1.
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 16 12:12:03 np0005620857.novalocal crond[1020]: (CRON) STARTUP (1.5.7)
Feb 16 12:12:03 np0005620857.novalocal crond[1020]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Reached target Login Prompts.
Feb 16 12:12:03 np0005620857.novalocal crond[1020]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 71% if used.)
Feb 16 12:12:03 np0005620857.novalocal crond[1020]: (CRON) INFO (running with inotify support)
Feb 16 12:12:03 np0005620857.novalocal rsyslogd[1016]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1016" x-info="https://www.rsyslog.com"] start
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Started System Logging Service.
Feb 16 12:12:03 np0005620857.novalocal rsyslogd[1016]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 16 12:12:03 np0005620857.novalocal systemd[1]: Reached target Multi-User System.
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1071]: Unable to negotiate with 38.102.83.114 port 58608: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 16 12:12:04 np0005620857.novalocal rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1098]: Unable to negotiate with 38.102.83.114 port 58626: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1110]: Unable to negotiate with 38.102.83.114 port 58638: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1042]: Connection closed by 38.102.83.114 port 52436 [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1150]: Unable to negotiate with 38.102.83.114 port 58670: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1157]: Unable to negotiate with 38.102.83.114 port 58674: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1083]: Connection closed by 38.102.83.114 port 58624 [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1122]: Connection closed by 38.102.83.114 port 58644 [preauth]
Feb 16 12:12:04 np0005620857.novalocal sshd-session[1135]: Connection closed by 38.102.83.114 port 58656 [preauth]
Feb 16 12:12:04 np0005620857.novalocal kdumpctl[1025]: kdump: No kdump initial ramdisk found.
Feb 16 12:12:04 np0005620857.novalocal kdumpctl[1025]: kdump: Rebuilding /boot/initramfs-5.14.0-677.el9.x86_64kdump.img
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1194]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 16 Feb 2026 12:12:04 +0000. Up 8.71 seconds.
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1498]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 16 Feb 2026 12:12:04 +0000. Up 9.10 seconds.
Feb 16 12:12:04 np0005620857.novalocal dracut[1521]: dracut-057-110.git20260130.el9
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1526]: #############################################################
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1533]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1541]: 256 SHA256:8FNtryl+faUogb797dxmo244b+NbrekP0jK8cwfSJlY root@np0005620857.novalocal (ECDSA)
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1543]: 256 SHA256:ZtwDcdKp/4mae2J7C2luJ9BppjV5BvIeaBBFSKkb8Lg root@np0005620857.novalocal (ED25519)
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1545]: 3072 SHA256:b4KJbfJ6tDDOGli9ZXYKjH17c/KOJtSJ7ttr94x5BFI root@np0005620857.novalocal (RSA)
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1546]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1547]: #############################################################
Feb 16 12:12:04 np0005620857.novalocal cloud-init[1498]: Cloud-init v. 24.4-8.el9 finished at Mon, 16 Feb 2026 12:12:04 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.27 seconds
Feb 16 12:12:04 np0005620857.novalocal dracut[1524]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-677.el9.x86_64kdump.img 5.14.0-677.el9.x86_64
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 16 12:12:04 np0005620857.novalocal systemd[1]: Reached target Cloud-init target.
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: memstrack is not available
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 12:12:05 np0005620857.novalocal dracut[1524]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: memstrack is not available
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: *** Including module: systemd ***
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: *** Including module: fips ***
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: *** Including module: systemd-initrd ***
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: *** Including module: i18n ***
Feb 16 12:12:06 np0005620857.novalocal dracut[1524]: *** Including module: drm ***
Feb 16 12:12:07 np0005620857.novalocal chronyd[801]: Selected source 158.69.193.108 (2.centos.pool.ntp.org)
Feb 16 12:12:07 np0005620857.novalocal chronyd[801]: System clock TAI offset set to 37 seconds
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: prefixdevname ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: kernel-modules ***
Feb 16 12:12:07 np0005620857.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: kernel-modules-extra ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: qemu ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: fstab-sys ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: rootfs-block ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: terminfo ***
Feb 16 12:12:07 np0005620857.novalocal dracut[1524]: *** Including module: udev-rules ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: Skipping udev rule: 91-permissions.rules
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: virtiofs ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: dracut-systemd ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: usrmount ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: base ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: fs-lib ***
Feb 16 12:12:08 np0005620857.novalocal dracut[1524]: *** Including module: kdumpbase ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:   microcode_ctl module: mangling fw_dir
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Including module: openssl ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Including module: shutdown ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Including module: squash ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Including modules done ***
Feb 16 12:12:09 np0005620857.novalocal dracut[1524]: *** Installing kernel module dependencies ***
Feb 16 12:12:10 np0005620857.novalocal dracut[1524]: *** Installing kernel module dependencies done ***
Feb 16 12:12:10 np0005620857.novalocal dracut[1524]: *** Resolving executable dependencies ***
Feb 16 12:12:11 np0005620857.novalocal chronyd[801]: Selected source 209.227.173.244 (2.centos.pool.ntp.org)
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 25 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 31 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 28 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 32 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 30 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 16 12:12:11 np0005620857.novalocal irqbalance[814]: IRQ 29 affinity is now unmanaged
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: *** Resolving executable dependencies done ***
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: *** Generating early-microcode cpio image ***
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: *** Store current command line parameters ***
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: Stored kernel commandline:
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: No dracut internal kernel commandline stored in the initramfs
Feb 16 12:12:11 np0005620857.novalocal dracut[1524]: *** Install squash loader ***
Feb 16 12:12:12 np0005620857.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:12:12 np0005620857.novalocal dracut[1524]: *** Squashing the files inside the initramfs ***
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: *** Squashing the files inside the initramfs done ***
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: *** Creating image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' ***
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: *** Hardlinking files ***
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Mode:           real
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Files:          50
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Linked:         0 files
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Compared:       0 xattrs
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Compared:       0 files
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Saved:          0 B
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: Duration:       0.000325 seconds
Feb 16 12:12:13 np0005620857.novalocal dracut[1524]: *** Hardlinking files done ***
Feb 16 12:12:17 np0005620857.novalocal dracut[1524]: *** Creating initramfs image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' done ***
Feb 16 12:12:23 np0005620857.novalocal kdumpctl[1025]: kdump: kexec: loaded kdump kernel
Feb 16 12:12:23 np0005620857.novalocal kdumpctl[1025]: kdump: Starting kdump: [OK]
Feb 16 12:12:23 np0005620857.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 16 12:12:23 np0005620857.novalocal systemd[1]: Startup finished in 1.288s (kernel) + 2.371s (initrd) + 24.058s (userspace) = 27.718s.
Feb 16 12:12:26 np0005620857.novalocal sshd-session[4795]: Invalid user zabbix from 104.248.93.62 port 55182
Feb 16 12:12:27 np0005620857.novalocal sshd-session[4795]: Connection closed by invalid user zabbix 104.248.93.62 port 55182 [preauth]
Feb 16 12:12:32 np0005620857.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:13:01 np0005620857.novalocal sshd-session[4799]: Connection closed by 142.93.238.36 port 53114
Feb 16 12:13:11 np0005620857.novalocal sshd-session[4800]: Invalid user zabbix from 104.248.93.62 port 41660
Feb 16 12:13:11 np0005620857.novalocal sshd-session[4800]: Connection closed by invalid user zabbix 104.248.93.62 port 41660 [preauth]
Feb 16 12:13:16 np0005620857.novalocal sshd-session[4802]: Accepted publickey for zuul from 38.102.83.114 port 53776 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 16 12:13:16 np0005620857.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 16 12:13:16 np0005620857.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 16 12:13:16 np0005620857.novalocal systemd-logind[821]: New session 1 of user zuul.
Feb 16 12:13:16 np0005620857.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 16 12:13:16 np0005620857.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 16 12:13:16 np0005620857.novalocal systemd[4806]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Queued start job for default target Main User Target.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Created slice User Application Slice.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Reached target Paths.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Reached target Timers.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Starting D-Bus User Message Bus Socket...
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Starting Create User's Volatile Files and Directories...
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Finished Create User's Volatile Files and Directories.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Listening on D-Bus User Message Bus Socket.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Reached target Sockets.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Reached target Basic System.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Reached target Main User Target.
Feb 16 12:13:17 np0005620857.novalocal systemd[4806]: Startup finished in 165ms.
Feb 16 12:13:17 np0005620857.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 16 12:13:17 np0005620857.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 16 12:13:17 np0005620857.novalocal sshd-session[4802]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:13:17 np0005620857.novalocal python3[4888]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:24 np0005620857.novalocal python3[4916]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:30 np0005620857.novalocal python3[4974]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:30 np0005620857.novalocal python3[5014]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 16 12:13:32 np0005620857.novalocal python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDEJ+pe7LiJg9hVdoAKQ3i1qJF3L7BdinKweX4yR5h1ilsosYkEGsDObr6Mpln9z+TBza2qi/QM4ApOm8G9IXqCNzAFmiKaxc3hoc/d9IsH/s3MeDHtpQAnQlqE3Y5hs6PeHJyA3ivsTX406B10I94prnPs/s5E27DC1AMcOxTN942G4U4Bcrzq5z7IODNs1GO0RoIhIf5ineLLFextnNL5bj71Qr4cTvgeBJRky09Csj3rsTUdu9QQ25yMqYYyv+BgrbPJU3OeGEJIpbnVuBGqfnF/+wei2jqiSFyzTLwwz0pLrLOH1/y+JWNWgsbYM95pE8noPRZQBddrESG9zAXy/IhEiYE6GfbucJDR/S05liuwLOtHmw29SVPzkQxae1eI3xZVS2rlxNQ5W09IDsZs9ZSch4KeHJ9YYRBZpdV1tyjMZ6KZrp+YvMaTfUwmPxbZzVfubTQRxAXPWqWxk8yJdz9eys5mQIcsSECfqnPPAC5gYiXrQErNf7T1DXAX4Dc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:33 np0005620857.novalocal python3[5064]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:34 np0005620857.novalocal python3[5163]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:34 np0005620857.novalocal python3[5234]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244013.9432178-230-152091710378131/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7ba9b203e349477e959408b698d6a67f_id_rsa follow=False checksum=8844458f19d0dfd440f6deb91fc2a8ce557db95a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:35 np0005620857.novalocal python3[5357]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:35 np0005620857.novalocal python3[5428]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244014.8661394-274-220084924894347/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7ba9b203e349477e959408b698d6a67f_id_rsa.pub follow=False checksum=9d5f0045bc91051867707ffccb8d9c1ad0378fd4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:36 np0005620857.novalocal python3[5476]: ansible-ping Invoked with data=pong
Feb 16 12:13:38 np0005620857.novalocal python3[5500]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:13:41 np0005620857.novalocal python3[5558]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 16 12:13:42 np0005620857.novalocal python3[5590]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:42 np0005620857.novalocal python3[5614]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:42 np0005620857.novalocal python3[5638]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620857.novalocal python3[5662]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620857.novalocal python3[5686]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:43 np0005620857.novalocal python3[5710]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:45 np0005620857.novalocal sudo[5734]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxdgkodmxgnjonfvnolefxszxfrmjar ; /usr/bin/python3'
Feb 16 12:13:45 np0005620857.novalocal sudo[5734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:45 np0005620857.novalocal python3[5736]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:45 np0005620857.novalocal sudo[5734]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:45 np0005620857.novalocal sudo[5812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzbevtkuouobcyptclutepnooxpvqfvh ; /usr/bin/python3'
Feb 16 12:13:45 np0005620857.novalocal sudo[5812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:45 np0005620857.novalocal python3[5814]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:45 np0005620857.novalocal sudo[5812]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:46 np0005620857.novalocal sudo[5885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xztoxugdzwifkotatmnrfvzvxqbxsxyc ; /usr/bin/python3'
Feb 16 12:13:46 np0005620857.novalocal sudo[5885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:46 np0005620857.novalocal python3[5887]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244025.538214-28-21304401712882/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:46 np0005620857.novalocal sudo[5885]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:47 np0005620857.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620857.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620857.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:47 np0005620857.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620857.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620857.novalocal python3[6055]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:48 np0005620857.novalocal python3[6079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620857.novalocal python3[6103]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620857.novalocal python3[6127]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620857.novalocal python3[6151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:49 np0005620857.novalocal python3[6175]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620857.novalocal python3[6199]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620857.novalocal python3[6223]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620857.novalocal python3[6247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:50 np0005620857.novalocal python3[6271]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620857.novalocal python3[6295]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620857.novalocal python3[6319]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620857.novalocal python3[6343]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:51 np0005620857.novalocal python3[6367]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620857.novalocal python3[6391]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620857.novalocal python3[6415]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620857.novalocal python3[6439]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:52 np0005620857.novalocal python3[6464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620857.novalocal python3[6489]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620857.novalocal python3[6513]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620857.novalocal sshd-session[6453]: Invalid user zabbix from 104.248.93.62 port 34612
Feb 16 12:13:53 np0005620857.novalocal python3[6537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:13:53 np0005620857.novalocal sshd-session[6453]: Connection closed by invalid user zabbix 104.248.93.62 port 34612 [preauth]
Feb 16 12:13:56 np0005620857.novalocal sudo[6561]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlrnejfmbbxttxcrqadkbmubljjridbr ; /usr/bin/python3'
Feb 16 12:13:56 np0005620857.novalocal sudo[6561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:57 np0005620857.novalocal python3[6563]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 12:13:57 np0005620857.novalocal systemd[1]: Starting Time & Date Service...
Feb 16 12:13:57 np0005620857.novalocal systemd[1]: Started Time & Date Service.
Feb 16 12:13:57 np0005620857.novalocal systemd-timedated[6565]: Changed time zone to 'UTC' (UTC).
Feb 16 12:13:57 np0005620857.novalocal sudo[6561]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:57 np0005620857.novalocal sudo[6592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggahticvrdufrnrrozppofvcyhiwciew ; /usr/bin/python3'
Feb 16 12:13:57 np0005620857.novalocal sudo[6592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:13:57 np0005620857.novalocal python3[6594]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:57 np0005620857.novalocal sudo[6592]: pam_unix(sudo:session): session closed for user root
Feb 16 12:13:58 np0005620857.novalocal python3[6670]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:58 np0005620857.novalocal python3[6741]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771244038.0032358-203-273396913037390/source _original_basename=tmppmrdybk8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:13:59 np0005620857.novalocal python3[6841]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:13:59 np0005620857.novalocal python3[6912]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771244038.812212-243-56920209690708/source _original_basename=tmpjkodk9ll follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:00 np0005620857.novalocal sudo[7012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxidpfngjnanfbcewlotrzxeuxxdqrdw ; /usr/bin/python3'
Feb 16 12:14:00 np0005620857.novalocal sudo[7012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:00 np0005620857.novalocal python3[7014]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:14:00 np0005620857.novalocal sudo[7012]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:00 np0005620857.novalocal sudo[7085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcsygmtdeycvmxpjmxxeonvmcmkovuvz ; /usr/bin/python3'
Feb 16 12:14:00 np0005620857.novalocal sudo[7085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:00 np0005620857.novalocal python3[7087]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771244040.0598733-307-102394752418816/source _original_basename=tmpzgs1al80 follow=False checksum=b61b8b67cbeabdb25607a6c3ed0750848521994a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:00 np0005620857.novalocal sudo[7085]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:01 np0005620857.novalocal python3[7135]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:01 np0005620857.novalocal python3[7161]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:01 np0005620857.novalocal sudo[7239]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yveqvpdvkpholosjbzupnnedvzouwgqh ; /usr/bin/python3'
Feb 16 12:14:01 np0005620857.novalocal sudo[7239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:01 np0005620857.novalocal python3[7241]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:14:01 np0005620857.novalocal sudo[7239]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:02 np0005620857.novalocal sudo[7312]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltmorvsalfhgtgugvtzlipsuhjtlusjg ; /usr/bin/python3'
Feb 16 12:14:02 np0005620857.novalocal sudo[7312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:02 np0005620857.novalocal python3[7314]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244041.6859834-363-210910216188548/source _original_basename=tmp2u0hafdl follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:02 np0005620857.novalocal sudo[7312]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:02 np0005620857.novalocal sudo[7363]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpergiuysoaydnnbeiucqfspgygsyvyc ; /usr/bin/python3'
Feb 16 12:14:02 np0005620857.novalocal sudo[7363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:02 np0005620857.novalocal python3[7365]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-8596-46eb-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:14:02 np0005620857.novalocal sudo[7363]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:03 np0005620857.novalocal python3[7393]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-8596-46eb-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 16 12:14:04 np0005620857.novalocal python3[7421]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:11 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 16 12:14:11 np0005620857.novalocal irqbalance[814]: IRQ 26 affinity is now unmanaged
Feb 16 12:14:22 np0005620857.novalocal sudo[7445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnlskfuoqrkteoiieiuwwccvgrytsdre ; /usr/bin/python3'
Feb 16 12:14:22 np0005620857.novalocal sudo[7445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:14:22 np0005620857.novalocal python3[7447]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:14:22 np0005620857.novalocal sudo[7445]: pam_unix(sudo:session): session closed for user root
Feb 16 12:14:27 np0005620857.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 12:14:36 np0005620857.novalocal sshd-session[7450]: Invalid user zabbix from 104.248.93.62 port 35214
Feb 16 12:14:36 np0005620857.novalocal sshd-session[7450]: Connection closed by invalid user zabbix 104.248.93.62 port 35214 [preauth]
Feb 16 12:14:57 np0005620857.novalocal sshd-session[7452]: Connection closed by authenticating user root 142.93.238.36 port 38702 [preauth]
Feb 16 12:15:19 np0005620857.novalocal sshd-session[7454]: Invalid user zabbix from 104.248.93.62 port 44602
Feb 16 12:15:19 np0005620857.novalocal sshd-session[7454]: Connection closed by invalid user zabbix 104.248.93.62 port 44602 [preauth]
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 16 12:15:22 np0005620857.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 16 12:15:22 np0005620857.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1435] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 12:15:22 np0005620857.novalocal systemd-udevd[7456]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1633] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1668] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1672] device (eth1): carrier: link connected
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1675] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1681] policy: auto-activating connection 'Wired connection 1' (044b5d65-36c8-3a82-aeee-bbb48ac5f904)
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1686] device (eth1): Activation: starting connection 'Wired connection 1' (044b5d65-36c8-3a82-aeee-bbb48ac5f904)
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1687] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1691] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1697] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:15:22 np0005620857.novalocal NetworkManager[869]: <info>  [1771244122.1701] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:22 np0005620857.novalocal systemd[4806]: Starting Mark boot as successful...
Feb 16 12:15:22 np0005620857.novalocal systemd[4806]: Finished Mark boot as successful.
Feb 16 12:15:23 np0005620857.novalocal sshd-session[4815]: Received disconnect from 38.102.83.114 port 53776:11: disconnected by user
Feb 16 12:15:23 np0005620857.novalocal sshd-session[4815]: Disconnected from user zuul 38.102.83.114 port 53776
Feb 16 12:15:23 np0005620857.novalocal sshd-session[4802]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:15:23 np0005620857.novalocal systemd-logind[821]: Session 1 logged out. Waiting for processes to exit.
Feb 16 12:15:23 np0005620857.novalocal sshd-session[7461]: Accepted publickey for zuul from 38.102.83.114 port 54884 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:15:23 np0005620857.novalocal systemd-logind[821]: New session 3 of user zuul.
Feb 16 12:15:23 np0005620857.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 16 12:15:23 np0005620857.novalocal sshd-session[7461]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:15:23 np0005620857.novalocal python3[7488]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-886a-88de-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:15:30 np0005620857.novalocal sudo[7566]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgjhndqvhhjgvbaimaxtvxbunwgeolza ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:30 np0005620857.novalocal sudo[7566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:30 np0005620857.novalocal python3[7568]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:15:30 np0005620857.novalocal sudo[7566]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:30 np0005620857.novalocal sudo[7639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmhnzleclukgzmjvvvpzdhisimhbquw ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:30 np0005620857.novalocal sudo[7639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:30 np0005620857.novalocal python3[7641]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771244130.269537-154-249712851246913/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=648c4c00a1d627f94c97b1205e07f3752261ba89 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:15:30 np0005620857.novalocal sudo[7639]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:31 np0005620857.novalocal sudo[7689]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaarwefwkghznccdolwsvfsywkhiqheo ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:15:31 np0005620857.novalocal sudo[7689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:15:31 np0005620857.novalocal python3[7691]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Stopping Network Manager...
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.3891] caught SIGTERM, shutting down normally.
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.3905] dhcp4 (eth0): canceled DHCP transaction
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.3905] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.3905] dhcp4 (eth0): state changed no lease
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.3909] manager: NetworkManager state is now CONNECTING
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.4021] dhcp4 (eth1): canceled DHCP transaction
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.4022] dhcp4 (eth1): state changed no lease
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[869]: <info>  [1771244131.4101] exiting (success)
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Stopped Network Manager.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: NetworkManager.service: Consumed 1.675s CPU time, 10.2M memory peak.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Starting Network Manager...
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.4730] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:cd836bab-140a-4a06-bcbf-b453ec38ea52)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.4734] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.4795] manager[0x562ae5180000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Starting Hostname Service...
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Started Hostname Service.
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5832] hostname: hostname: using hostnamed
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5833] hostname: static hostname changed from (none) to "np0005620857.novalocal"
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5838] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5843] manager[0x562ae5180000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5844] manager[0x562ae5180000]: rfkill: WWAN hardware radio set enabled
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5875] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5876] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5877] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5878] manager: Networking is enabled by state file
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5881] settings: Loaded settings plugin: keyfile (internal)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5887] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5919] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5928] dhcp: init: Using DHCP client 'internal'
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5932] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5937] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5943] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5951] device (lo): Activation: starting connection 'lo' (1e10248c-d525-48d3-b66b-d34bc8862c9f)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5957] device (eth0): carrier: link connected
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5962] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5967] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5968] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5973] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5980] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5985] device (eth1): carrier: link connected
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5989] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5994] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (044b5d65-36c8-3a82-aeee-bbb48ac5f904) (indicated)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.5995] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6001] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6008] device (eth1): Activation: starting connection 'Wired connection 1' (044b5d65-36c8-3a82-aeee-bbb48ac5f904)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6016] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Started Network Manager.
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6022] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6024] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6027] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6029] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6034] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6044] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6048] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6052] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6061] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6064] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6075] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6079] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6097] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6098] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6104] device (lo): Activation: successful, device activated.
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6119] dhcp4 (eth0): state changed new lease, address=38.102.83.251
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6124] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 12:15:31 np0005620857.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6250] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6266] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6267] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6271] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6275] device (eth0): Activation: successful, device activated.
Feb 16 12:15:31 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244131.6281] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 12:15:31 np0005620857.novalocal sudo[7689]: pam_unix(sudo:session): session closed for user root
Feb 16 12:15:31 np0005620857.novalocal python3[7775]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-886a-88de-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:15:41 np0005620857.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:16:01 np0005620857.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:16:02 np0005620857.novalocal sshd-session[7780]: Invalid user zabbix from 104.248.93.62 port 41426
Feb 16 12:16:02 np0005620857.novalocal sshd-session[7780]: Connection closed by invalid user zabbix 104.248.93.62 port 41426 [preauth]
Feb 16 12:16:08 np0005620857.novalocal sshd-session[7782]: Connection closed by authenticating user root 142.93.238.36 port 36296 [preauth]
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5257] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 12:16:16 np0005620857.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:16:16 np0005620857.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5588] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5591] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5597] device (eth1): Activation: successful, device activated.
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5605] manager: startup complete
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5607] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <warn>  [1771244176.5611] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5618] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5791] dhcp4 (eth1): canceled DHCP transaction
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5794] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5794] dhcp4 (eth1): state changed no lease
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5817] policy: auto-activating connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b)
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5825] device (eth1): Activation: starting connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b)
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5827] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5833] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5844] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5858] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5911] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5913] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 12:16:16 np0005620857.novalocal NetworkManager[7706]: <info>  [1771244176.5922] device (eth1): Activation: successful, device activated.
Feb 16 12:16:26 np0005620857.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:16:32 np0005620857.novalocal sshd-session[7464]: Received disconnect from 38.102.83.114 port 54884:11: disconnected by user
Feb 16 12:16:32 np0005620857.novalocal sshd-session[7464]: Disconnected from user zuul 38.102.83.114 port 54884
Feb 16 12:16:32 np0005620857.novalocal sshd-session[7461]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:16:32 np0005620857.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 16 12:16:32 np0005620857.novalocal systemd[1]: session-3.scope: Consumed 1.448s CPU time.
Feb 16 12:16:32 np0005620857.novalocal systemd-logind[821]: Session 3 logged out. Waiting for processes to exit.
Feb 16 12:16:32 np0005620857.novalocal systemd-logind[821]: Removed session 3.
Feb 16 12:16:32 np0005620857.novalocal sshd-session[7808]: Accepted publickey for zuul from 38.102.83.114 port 45774 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:16:32 np0005620857.novalocal systemd-logind[821]: New session 4 of user zuul.
Feb 16 12:16:32 np0005620857.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 16 12:16:32 np0005620857.novalocal sshd-session[7808]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:16:33 np0005620857.novalocal sudo[7887]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrqlkcsnuddteizencvgsvwgwsnnoegr ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:16:33 np0005620857.novalocal sudo[7887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:16:33 np0005620857.novalocal python3[7889]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:16:33 np0005620857.novalocal sudo[7887]: pam_unix(sudo:session): session closed for user root
Feb 16 12:16:33 np0005620857.novalocal sudo[7960]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocbeiloftfknstwbloiqpevwvahprhzc ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 12:16:33 np0005620857.novalocal sudo[7960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:16:33 np0005620857.novalocal python3[7962]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244193.0685046-312-73915508203992/source _original_basename=tmpt_pa27s2 follow=False checksum=0ef1aafc1c85a846b4f84c91412dbcb842d2a2f1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:16:33 np0005620857.novalocal sudo[7960]: pam_unix(sudo:session): session closed for user root
Feb 16 12:16:35 np0005620857.novalocal sshd-session[7811]: Connection closed by 38.102.83.114 port 45774
Feb 16 12:16:35 np0005620857.novalocal sshd-session[7808]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:16:35 np0005620857.novalocal systemd-logind[821]: Session 4 logged out. Waiting for processes to exit.
Feb 16 12:16:35 np0005620857.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 16 12:16:35 np0005620857.novalocal systemd-logind[821]: Removed session 4.
Feb 16 12:16:46 np0005620857.novalocal sshd-session[7987]: Invalid user zabbix from 104.248.93.62 port 33380
Feb 16 12:16:46 np0005620857.novalocal sshd-session[7987]: Connection closed by invalid user zabbix 104.248.93.62 port 33380 [preauth]
Feb 16 12:17:26 np0005620857.novalocal sshd-session[7989]: Connection closed by authenticating user root 142.93.238.36 port 36182 [preauth]
Feb 16 12:17:29 np0005620857.novalocal sshd-session[7991]: Invalid user zabbix from 104.248.93.62 port 40432
Feb 16 12:17:29 np0005620857.novalocal sshd-session[7991]: Connection closed by invalid user zabbix 104.248.93.62 port 40432 [preauth]
Feb 16 12:18:10 np0005620857.novalocal sshd-session[7993]: Invalid user hadoop from 104.248.93.62 port 57726
Feb 16 12:18:10 np0005620857.novalocal sshd-session[7993]: Connection closed by invalid user hadoop 104.248.93.62 port 57726 [preauth]
Feb 16 12:18:40 np0005620857.novalocal sshd-session[7996]: Connection closed by authenticating user root 142.93.238.36 port 54158 [preauth]
Feb 16 12:18:43 np0005620857.novalocal systemd[4806]: Created slice User Background Tasks Slice.
Feb 16 12:18:43 np0005620857.novalocal systemd[4806]: Starting Cleanup of User's Temporary Files and Directories...
Feb 16 12:18:43 np0005620857.novalocal systemd[4806]: Finished Cleanup of User's Temporary Files and Directories.
Feb 16 12:18:49 np0005620857.novalocal sshd-session[7999]: Invalid user hadoop from 104.248.93.62 port 47496
Feb 16 12:18:49 np0005620857.novalocal sshd-session[7999]: Connection closed by invalid user hadoop 104.248.93.62 port 47496 [preauth]
Feb 16 12:19:30 np0005620857.novalocal sshd-session[8001]: Invalid user hadoop from 104.248.93.62 port 48276
Feb 16 12:19:30 np0005620857.novalocal sshd-session[8001]: Connection closed by invalid user hadoop 104.248.93.62 port 48276 [preauth]
Feb 16 12:19:53 np0005620857.novalocal sshd-session[8003]: Connection closed by authenticating user root 142.93.238.36 port 40434 [preauth]
Feb 16 12:20:10 np0005620857.novalocal sshd-session[8005]: Invalid user hadoop from 104.248.93.62 port 36898
Feb 16 12:20:10 np0005620857.novalocal sshd-session[8005]: Connection closed by invalid user hadoop 104.248.93.62 port 36898 [preauth]
Feb 16 12:20:52 np0005620857.novalocal sshd-session[8007]: Invalid user hadoop from 104.248.93.62 port 58982
Feb 16 12:20:53 np0005620857.novalocal sshd-session[8007]: Connection closed by invalid user hadoop 104.248.93.62 port 58982 [preauth]
Feb 16 12:21:06 np0005620857.novalocal sshd-session[8009]: Connection closed by authenticating user root 142.93.238.36 port 58484 [preauth]
Feb 16 12:21:36 np0005620857.novalocal sshd-session[8011]: Invalid user hadoop from 104.248.93.62 port 60848
Feb 16 12:21:36 np0005620857.novalocal sshd-session[8011]: Connection closed by invalid user hadoop 104.248.93.62 port 60848 [preauth]
Feb 16 12:22:19 np0005620857.novalocal sshd-session[8013]: Connection closed by authenticating user root 142.93.238.36 port 56274 [preauth]
Feb 16 12:22:19 np0005620857.novalocal sshd-session[8016]: Invalid user hadoop from 104.248.93.62 port 38892
Feb 16 12:22:19 np0005620857.novalocal sshd-session[8016]: Connection closed by invalid user hadoop 104.248.93.62 port 38892 [preauth]
Feb 16 12:23:05 np0005620857.novalocal sshd-session[8018]: Invalid user hadoop from 104.248.93.62 port 38334
Feb 16 12:23:05 np0005620857.novalocal sshd-session[8018]: Connection closed by invalid user hadoop 104.248.93.62 port 38334 [preauth]
Feb 16 12:23:33 np0005620857.novalocal sshd-session[8021]: Connection closed by authenticating user root 142.93.238.36 port 35788 [preauth]
Feb 16 12:23:50 np0005620857.novalocal sshd-session[8023]: Invalid user hadoop from 104.248.93.62 port 36930
Feb 16 12:23:50 np0005620857.novalocal sshd-session[8023]: Connection closed by invalid user hadoop 104.248.93.62 port 36930 [preauth]
Feb 16 12:24:35 np0005620857.novalocal sshd-session[8026]: Invalid user hadoop from 104.248.93.62 port 39298
Feb 16 12:24:35 np0005620857.novalocal sshd-session[8026]: Connection closed by invalid user hadoop 104.248.93.62 port 39298 [preauth]
Feb 16 12:24:47 np0005620857.novalocal sshd-session[8028]: Connection closed by authenticating user root 142.93.238.36 port 40314 [preauth]
Feb 16 12:25:18 np0005620857.novalocal sshd-session[8030]: Invalid user hadoop from 104.248.93.62 port 33590
Feb 16 12:25:19 np0005620857.novalocal sshd-session[8030]: Connection closed by invalid user hadoop 104.248.93.62 port 33590 [preauth]
Feb 16 12:26:00 np0005620857.novalocal sshd-session[8032]: Invalid user hadoop from 104.248.93.62 port 42952
Feb 16 12:26:01 np0005620857.novalocal sshd-session[8032]: Connection closed by invalid user hadoop 104.248.93.62 port 42952 [preauth]
Feb 16 12:26:01 np0005620857.novalocal sshd-session[8034]: Connection closed by authenticating user root 142.93.238.36 port 49346 [preauth]
Feb 16 12:26:43 np0005620857.novalocal sshd-session[8036]: Invalid user hadoop from 104.248.93.62 port 33182
Feb 16 12:26:43 np0005620857.novalocal sshd-session[8036]: Connection closed by invalid user hadoop 104.248.93.62 port 33182 [preauth]
Feb 16 12:27:14 np0005620857.novalocal sshd-session[8038]: Connection closed by authenticating user root 142.93.238.36 port 39476 [preauth]
Feb 16 12:27:14 np0005620857.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 16 12:27:14 np0005620857.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 16 12:27:14 np0005620857.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 16 12:27:14 np0005620857.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 16 12:27:25 np0005620857.novalocal sshd-session[8043]: Invalid user mysql from 104.248.93.62 port 58774
Feb 16 12:27:25 np0005620857.novalocal sshd-session[8043]: Connection closed by invalid user mysql 104.248.93.62 port 58774 [preauth]
Feb 16 12:28:06 np0005620857.novalocal sshd-session[8045]: Invalid user mysql from 104.248.93.62 port 50510
Feb 16 12:28:06 np0005620857.novalocal sshd-session[8045]: Connection closed by invalid user mysql 104.248.93.62 port 50510 [preauth]
Feb 16 12:28:10 np0005620857.novalocal sshd-session[8048]: Accepted publickey for zuul from 38.102.83.114 port 47032 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:28:10 np0005620857.novalocal systemd-logind[821]: New session 5 of user zuul.
Feb 16 12:28:10 np0005620857.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 16 12:28:10 np0005620857.novalocal sshd-session[8048]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:28:10 np0005620857.novalocal sudo[8075]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xisotkkyxegxhuboojhwbmhzhcbxgcto ; /usr/bin/python3'
Feb 16 12:28:10 np0005620857.novalocal sudo[8075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:10 np0005620857.novalocal python3[8077]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-94f8-a218-000000000cd9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:10 np0005620857.novalocal sudo[8075]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620857.novalocal sudo[8104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iparcqxvglpfkyzzfphgmkcyyxzynozl ; /usr/bin/python3'
Feb 16 12:28:11 np0005620857.novalocal sudo[8104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620857.novalocal python3[8106]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620857.novalocal sudo[8104]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620857.novalocal sudo[8130]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzbdwrvcftbdvokqowadcilwoxjxlxce ; /usr/bin/python3'
Feb 16 12:28:11 np0005620857.novalocal sudo[8130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620857.novalocal python3[8132]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620857.novalocal sudo[8130]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620857.novalocal sudo[8156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adtyheuvntixzvnpafzsdvtxhlviwbpe ; /usr/bin/python3'
Feb 16 12:28:11 np0005620857.novalocal sudo[8156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:11 np0005620857.novalocal python3[8158]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:11 np0005620857.novalocal sudo[8156]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:11 np0005620857.novalocal sudo[8182]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daadtcpqhlrbthkojiyjknopwcqtxpuq ; /usr/bin/python3'
Feb 16 12:28:11 np0005620857.novalocal sudo[8182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:12 np0005620857.novalocal python3[8184]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:12 np0005620857.novalocal sudo[8182]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620857.novalocal sudo[8208]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkvqjxnqvnnpsaqhhphymlkbrjbrxnrq ; /usr/bin/python3'
Feb 16 12:28:13 np0005620857.novalocal sudo[8208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:13 np0005620857.novalocal python3[8210]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:13 np0005620857.novalocal sudo[8208]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620857.novalocal sudo[8286]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljaerfofmofrpnaqehrzrjeocfxwusxv ; /usr/bin/python3'
Feb 16 12:28:13 np0005620857.novalocal sudo[8286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:13 np0005620857.novalocal python3[8288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:28:13 np0005620857.novalocal sudo[8286]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:13 np0005620857.novalocal sudo[8359]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkwgrouymdodjsenhkwgmaqdinvgcvos ; /usr/bin/python3'
Feb 16 12:28:13 np0005620857.novalocal sudo[8359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:14 np0005620857.novalocal python3[8361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244893.530774-379-50798140601543/source _original_basename=tmptijlpj1u follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:28:14 np0005620857.novalocal sudo[8359]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:15 np0005620857.novalocal sudo[8409]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtrymmvgpueztufnelnrvfyqtqhwpnav ; /usr/bin/python3'
Feb 16 12:28:15 np0005620857.novalocal sudo[8409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:15 np0005620857.novalocal python3[8411]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 12:28:15 np0005620857.novalocal systemd[1]: Reloading.
Feb 16 12:28:15 np0005620857.novalocal systemd-rc-local-generator[8429]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:28:15 np0005620857.novalocal sudo[8409]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:16 np0005620857.novalocal sudo[8472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrvitulfnjddcgrwagvkpzntpvgomygn ; /usr/bin/python3'
Feb 16 12:28:17 np0005620857.novalocal sudo[8472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620857.novalocal python3[8474]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 16 12:28:17 np0005620857.novalocal sudo[8472]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620857.novalocal sudo[8498]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blrhymgmjmgybptqjjoxfetwyxoetycj ; /usr/bin/python3'
Feb 16 12:28:17 np0005620857.novalocal sudo[8498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620857.novalocal python3[8500]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:17 np0005620857.novalocal sudo[8498]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620857.novalocal sudo[8526]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odhjlgvlwvhxrzaolwotxvyvlpzqsnas ; /usr/bin/python3'
Feb 16 12:28:17 np0005620857.novalocal sudo[8526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620857.novalocal python3[8528]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:17 np0005620857.novalocal sudo[8526]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:17 np0005620857.novalocal sudo[8554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grwhtyapddqjqqpgvilqzjarstihbzsj ; /usr/bin/python3'
Feb 16 12:28:17 np0005620857.novalocal sudo[8554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:17 np0005620857.novalocal python3[8556]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:18 np0005620857.novalocal sudo[8554]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:18 np0005620857.novalocal sudo[8582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzlokjgvcehoqdvlotlegzlyjsjgzzlp ; /usr/bin/python3'
Feb 16 12:28:18 np0005620857.novalocal sudo[8582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:18 np0005620857.novalocal python3[8584]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:18 np0005620857.novalocal sudo[8582]: pam_unix(sudo:session): session closed for user root
Feb 16 12:28:19 np0005620857.novalocal python3[8612]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-94f8-a218-000000000ce0-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:28:19 np0005620857.novalocal python3[8642]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 16 12:28:22 np0005620857.novalocal sshd-session[8051]: Connection closed by 38.102.83.114 port 47032
Feb 16 12:28:22 np0005620857.novalocal sshd-session[8048]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:28:22 np0005620857.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 16 12:28:22 np0005620857.novalocal systemd[1]: session-5.scope: Consumed 3.497s CPU time.
Feb 16 12:28:22 np0005620857.novalocal systemd-logind[821]: Session 5 logged out. Waiting for processes to exit.
Feb 16 12:28:22 np0005620857.novalocal systemd-logind[821]: Removed session 5.
Feb 16 12:28:23 np0005620857.novalocal sshd-session[8647]: Accepted publickey for zuul from 38.102.83.114 port 49254 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:28:23 np0005620857.novalocal systemd-logind[821]: New session 6 of user zuul.
Feb 16 12:28:23 np0005620857.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 16 12:28:23 np0005620857.novalocal sshd-session[8647]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:28:23 np0005620857.novalocal sudo[8674]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cibhamznwrsnsobmuaytfuwfghdrjoek ; /usr/bin/python3'
Feb 16 12:28:23 np0005620857.novalocal sudo[8674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:28:24 np0005620857.novalocal python3[8676]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 16 12:28:28 np0005620857.novalocal sshd-session[8682]: Connection closed by authenticating user root 142.93.238.36 port 54792 [preauth]
Feb 16 12:28:32 np0005620857.novalocal setsebool[8717]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 16 12:28:32 np0005620857.novalocal setsebool[8717]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:28:45 np0005620857.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:28:50 np0005620857.novalocal sshd-session[8745]: Invalid user mysql from 104.248.93.62 port 60680
Feb 16 12:28:50 np0005620857.novalocal sshd-session[8745]: Connection closed by invalid user mysql 104.248.93.62 port 60680 [preauth]
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:28:57 np0005620857.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:29:15 np0005620857.novalocal dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 12:29:15 np0005620857.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:29:15 np0005620857.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:29:15 np0005620857.novalocal systemd[1]: Reloading.
Feb 16 12:29:15 np0005620857.novalocal systemd-rc-local-generator[9509]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:29:16 np0005620857.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:29:17 np0005620857.novalocal sudo[8674]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:17 np0005620857.novalocal python3[11616]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-c8c4-2e34-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:29:19 np0005620857.novalocal kernel: evm: overlay not supported
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: Starting D-Bus User Message Bus...
Feb 16 12:29:19 np0005620857.novalocal dbus-broker-launch[12714]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 16 12:29:19 np0005620857.novalocal dbus-broker-launch[12714]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: Started D-Bus User Message Bus.
Feb 16 12:29:19 np0005620857.novalocal dbus-broker-lau[12714]: Ready
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: Created slice Slice /user.
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: podman-12604.scope: unit configures an IP firewall, but not running as root.
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: (This warning is only shown for the first unit using IP firewalling.)
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: Started podman-12604.scope.
Feb 16 12:29:19 np0005620857.novalocal systemd[4806]: Started podman-pause-8fdd1c9a.scope.
Feb 16 12:29:20 np0005620857.novalocal sudo[13376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqakeaxxkmzknkxzpypwpxhfrzmqerov ; /usr/bin/python3'
Feb 16 12:29:20 np0005620857.novalocal sudo[13376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:20 np0005620857.novalocal python3[13393]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.82:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.82:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:29:20 np0005620857.novalocal python3[13393]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 16 12:29:20 np0005620857.novalocal sudo[13376]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:20 np0005620857.novalocal sshd-session[8650]: Connection closed by 38.102.83.114 port 49254
Feb 16 12:29:20 np0005620857.novalocal sshd-session[8647]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:29:20 np0005620857.novalocal systemd-logind[821]: Session 6 logged out. Waiting for processes to exit.
Feb 16 12:29:20 np0005620857.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Feb 16 12:29:20 np0005620857.novalocal systemd[1]: session-6.scope: Consumed 46.553s CPU time.
Feb 16 12:29:20 np0005620857.novalocal systemd-logind[821]: Removed session 6.
Feb 16 12:29:21 np0005620857.novalocal irqbalance[814]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 16 12:29:21 np0005620857.novalocal irqbalance[814]: IRQ 27 affinity is now unmanaged
Feb 16 12:29:35 np0005620857.novalocal sshd-session[22762]: Invalid user mysql from 104.248.93.62 port 49372
Feb 16 12:29:35 np0005620857.novalocal sshd-session[22762]: Connection closed by invalid user mysql 104.248.93.62 port 49372 [preauth]
Feb 16 12:29:40 np0005620857.novalocal sshd-session[26149]: Connection closed by 38.102.83.173 port 39214 [preauth]
Feb 16 12:29:40 np0005620857.novalocal sshd-session[26151]: Connection closed by 38.102.83.173 port 39230 [preauth]
Feb 16 12:29:40 np0005620857.novalocal sshd-session[26157]: Unable to negotiate with 38.102.83.173 port 39242: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 16 12:29:40 np0005620857.novalocal sshd-session[26150]: Unable to negotiate with 38.102.83.173 port 39252: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 16 12:29:40 np0005620857.novalocal sshd-session[26154]: Unable to negotiate with 38.102.83.173 port 39250: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 16 12:29:43 np0005620857.novalocal sshd-session[27621]: Connection closed by authenticating user root 142.93.238.36 port 55616 [preauth]
Feb 16 12:29:44 np0005620857.novalocal sshd-session[27944]: Accepted publickey for zuul from 38.102.83.114 port 48448 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:29:44 np0005620857.novalocal systemd-logind[821]: New session 7 of user zuul.
Feb 16 12:29:44 np0005620857.novalocal systemd[1]: Started Session 7 of User zuul.
Feb 16 12:29:44 np0005620857.novalocal sshd-session[27944]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:29:44 np0005620857.novalocal python3[28060]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:44 np0005620857.novalocal sudo[28287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nosnquoztysnpbmcfzsjkohblxicywvc ; /usr/bin/python3'
Feb 16 12:29:44 np0005620857.novalocal sudo[28287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:44 np0005620857.novalocal python3[28299]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:44 np0005620857.novalocal sudo[28287]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:45 np0005620857.novalocal sudo[28706]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnwmnazuznhoxgblylrwuoqumyyginkb ; /usr/bin/python3'
Feb 16 12:29:45 np0005620857.novalocal sudo[28706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:45 np0005620857.novalocal python3[28714]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005620857.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 16 12:29:45 np0005620857.novalocal useradd[28804]: new group: name=cloud-admin, GID=1002
Feb 16 12:29:45 np0005620857.novalocal useradd[28804]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 16 12:29:45 np0005620857.novalocal sudo[28706]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:45 np0005620857.novalocal sudo[28924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzqxpheixyvxtbwtozhpuettkzsdguna ; /usr/bin/python3'
Feb 16 12:29:45 np0005620857.novalocal sudo[28924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620857.novalocal python3[28938]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLw5bE4eRRUtvrM6OBDt8NKN02ATHZwzFbB7mSYctuYzb/b/Lpi0o/fihCOZ6zxxurHwzkN/sSjk0NQZ4P2XkSE= zuul@np0005620855.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 12:29:46 np0005620857.novalocal sudo[28924]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:46 np0005620857.novalocal sudo[29220]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sboybakghswgpbzcnoqdazdqoccjvuhq ; /usr/bin/python3'
Feb 16 12:29:46 np0005620857.novalocal sudo[29220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620857.novalocal python3[29229]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:29:46 np0005620857.novalocal sudo[29220]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:46 np0005620857.novalocal sudo[29519]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efmcobowupbdrdqebslvktbxwlysralw ; /usr/bin/python3'
Feb 16 12:29:46 np0005620857.novalocal sudo[29519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:46 np0005620857.novalocal python3[29529]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771244986.1693208-152-18916705606968/source _original_basename=tmpkcm9hqxi follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:29:46 np0005620857.novalocal sudo[29519]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:47 np0005620857.novalocal sudo[29856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmwmqjhdntdpoostsjkpejgjpiahbyyv ; /usr/bin/python3'
Feb 16 12:29:47 np0005620857.novalocal sudo[29856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:29:47 np0005620857.novalocal python3[29862]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Feb 16 12:29:47 np0005620857.novalocal systemd[1]: Starting Hostname Service...
Feb 16 12:29:47 np0005620857.novalocal systemd[1]: Started Hostname Service.
Feb 16 12:29:48 np0005620857.novalocal systemd-hostnamed[29953]: Changed pretty hostname to 'compute-1'
Feb 16 12:29:48 compute-1 systemd-hostnamed[29953]: Hostname set to <compute-1> (static)
Feb 16 12:29:48 compute-1 NetworkManager[7706]: <info>  [1771244988.6955] hostname: static hostname changed from "np0005620857.novalocal" to "compute-1"
Feb 16 12:29:48 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 12:29:48 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 12:29:48 compute-1 sudo[29856]: pam_unix(sudo:session): session closed for user root
Feb 16 12:29:48 compute-1 sshd-session[28000]: Connection closed by 38.102.83.114 port 48448
Feb 16 12:29:48 compute-1 sshd-session[27944]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:29:48 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Feb 16 12:29:48 compute-1 systemd[1]: session-7.scope: Consumed 2.044s CPU time.
Feb 16 12:29:48 compute-1 systemd-logind[821]: Session 7 logged out. Waiting for processes to exit.
Feb 16 12:29:48 compute-1 systemd-logind[821]: Removed session 7.
Feb 16 12:29:49 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:29:49 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:29:49 compute-1 systemd[1]: man-db-cache-update.service: Consumed 36.421s CPU time.
Feb 16 12:29:49 compute-1 systemd[1]: run-ra95fc1698d4a4a6e8f7c36c3e0e9c6fa.service: Deactivated successfully.
Feb 16 12:29:58 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 12:30:18 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 12:30:19 compute-1 sshd-session[30635]: Invalid user mysql from 104.248.93.62 port 36976
Feb 16 12:30:19 compute-1 sshd-session[30635]: Connection closed by invalid user mysql 104.248.93.62 port 36976 [preauth]
Feb 16 12:30:55 compute-1 sshd-session[30637]: Connection closed by authenticating user root 142.93.238.36 port 43308 [preauth]
Feb 16 12:31:05 compute-1 sshd-session[30639]: Invalid user mysql from 104.248.93.62 port 48864
Feb 16 12:31:06 compute-1 sshd-session[30639]: Connection closed by invalid user mysql 104.248.93.62 port 48864 [preauth]
Feb 16 12:31:49 compute-1 sshd-session[30644]: Invalid user mysql from 104.248.93.62 port 48236
Feb 16 12:31:49 compute-1 sshd-session[30644]: Connection closed by invalid user mysql 104.248.93.62 port 48236 [preauth]
Feb 16 12:32:07 compute-1 sshd-session[30646]: Connection closed by authenticating user root 142.93.238.36 port 51620 [preauth]
Feb 16 12:32:33 compute-1 sshd-session[30648]: Invalid user mysql from 104.248.93.62 port 59084
Feb 16 12:32:34 compute-1 sshd-session[30648]: Connection closed by invalid user mysql 104.248.93.62 port 59084 [preauth]
Feb 16 12:33:18 compute-1 sshd-session[30650]: Invalid user mysql from 104.248.93.62 port 47208
Feb 16 12:33:18 compute-1 sshd-session[30650]: Connection closed by invalid user mysql 104.248.93.62 port 47208 [preauth]
Feb 16 12:33:19 compute-1 sshd-session[30652]: Connection closed by authenticating user root 142.93.238.36 port 57648 [preauth]
Feb 16 12:33:20 compute-1 sshd-session[30654]: Accepted publickey for zuul from 38.102.83.173 port 58044 ssh2: RSA SHA256:5K91TOH9TC6bf3w9o3FKU3gHEsx4E7+lyHRIvDXmBIc
Feb 16 12:33:20 compute-1 systemd-logind[821]: New session 8 of user zuul.
Feb 16 12:33:20 compute-1 systemd[1]: Started Session 8 of User zuul.
Feb 16 12:33:20 compute-1 sshd-session[30654]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:33:20 compute-1 python3[30730]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:33:22 compute-1 sudo[30844]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irijhabksdbzpukqcguvqztczstsuhcy ; /usr/bin/python3'
Feb 16 12:33:22 compute-1 sudo[30844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:22 compute-1 python3[30846]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:22 compute-1 sudo[30844]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:22 compute-1 sudo[30917]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlnkdnogwvorpiasqkfgxcykwpwycdaw ; /usr/bin/python3'
Feb 16 12:33:22 compute-1 sudo[30917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:22 compute-1 python3[30919]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:22 compute-1 sudo[30917]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:22 compute-1 sudo[30943]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slyvobeevgujzctttyrpekmhpisqayjg ; /usr/bin/python3'
Feb 16 12:33:22 compute-1 sudo[30943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-1 python3[30945]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:23 compute-1 sudo[30943]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-1 sudo[31016]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcuulpufrqfiomhxjahoxfaplodxgxu ; /usr/bin/python3'
Feb 16 12:33:23 compute-1 sudo[31016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-1 python3[31018]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:23 compute-1 sudo[31016]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-1 sudo[31042]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peguyrkrkvqqragrzmzhspvihuupnrbz ; /usr/bin/python3'
Feb 16 12:33:23 compute-1 sudo[31042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-1 python3[31044]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:23 compute-1 sudo[31042]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-1 sudo[31115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urqnlrisjujcoknbsbvcfqsrxqpcruwv ; /usr/bin/python3'
Feb 16 12:33:23 compute-1 sudo[31115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:23 compute-1 python3[31117]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:23 compute-1 sudo[31115]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:23 compute-1 sudo[31142]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhvdjpgbxlwvxpaibalogpywxwvijleu ; /usr/bin/python3'
Feb 16 12:33:23 compute-1 sudo[31142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-1 python3[31144]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:24 compute-1 sudo[31142]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-1 sudo[31215]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttgljxzykziubeyjibqkdheqydsvaskk ; /usr/bin/python3'
Feb 16 12:33:24 compute-1 sudo[31215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-1 python3[31217]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:24 compute-1 sudo[31215]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-1 sudo[31241]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erzynmhhhfjwbcylfjugftjycduyzhth ; /usr/bin/python3'
Feb 16 12:33:24 compute-1 sudo[31241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-1 python3[31243]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:24 compute-1 sudo[31241]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-1 sudo[31314]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svufrcgckijjszxxrnbhnycxtyvrohld ; /usr/bin/python3'
Feb 16 12:33:24 compute-1 sudo[31314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:24 compute-1 python3[31316]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:24 compute-1 sudo[31314]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:24 compute-1 sudo[31340]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjrfljkhjamwwgwaetnvnngagvsrruwp ; /usr/bin/python3'
Feb 16 12:33:24 compute-1 sudo[31340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-1 python3[31342]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:25 compute-1 sudo[31340]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-1 sudo[31413]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzqytrovpktitswoypxgnnjvnzfmpujv ; /usr/bin/python3'
Feb 16 12:33:25 compute-1 sudo[31413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-1 python3[31415]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:25 compute-1 sudo[31413]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-1 sudo[31439]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zunnqzcctbqwargogoyvtzjybxuklcmv ; /usr/bin/python3'
Feb 16 12:33:25 compute-1 sudo[31439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-1 python3[31441]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 12:33:25 compute-1 sudo[31439]: pam_unix(sudo:session): session closed for user root
Feb 16 12:33:25 compute-1 sudo[31512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkzusqmzhftdsurxebpxaclnakaunnuw ; /usr/bin/python3'
Feb 16 12:33:25 compute-1 sudo[31512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:33:25 compute-1 python3[31514]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771245202.125659-34522-155723400996088/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:33:25 compute-1 sudo[31512]: pam_unix(sudo:session): session closed for user root
Feb 16 12:34:01 compute-1 sshd-session[31539]: Invalid user mysql from 104.248.93.62 port 60930
Feb 16 12:34:01 compute-1 sshd-session[31539]: Connection closed by invalid user mysql 104.248.93.62 port 60930 [preauth]
Feb 16 12:34:31 compute-1 sshd-session[31542]: Connection closed by authenticating user root 142.93.238.36 port 41770 [preauth]
Feb 16 12:34:46 compute-1 sshd-session[31544]: Invalid user mysql from 104.248.93.62 port 47746
Feb 16 12:34:46 compute-1 sshd-session[31544]: Connection closed by invalid user mysql 104.248.93.62 port 47746 [preauth]
Feb 16 12:35:32 compute-1 sshd-session[31547]: Invalid user mysql from 104.248.93.62 port 32930
Feb 16 12:35:32 compute-1 sshd-session[31547]: Connection closed by invalid user mysql 104.248.93.62 port 32930 [preauth]
Feb 16 12:35:43 compute-1 sshd-session[31549]: Connection closed by authenticating user root 142.93.238.36 port 45812 [preauth]
Feb 16 12:36:17 compute-1 sshd-session[31551]: Invalid user mysql from 104.248.93.62 port 49080
Feb 16 12:36:17 compute-1 sshd-session[31551]: Connection closed by invalid user mysql 104.248.93.62 port 49080 [preauth]
Feb 16 12:36:54 compute-1 sshd-session[31553]: Connection closed by authenticating user root 142.93.238.36 port 37106 [preauth]
Feb 16 12:37:01 compute-1 sshd-session[31555]: Invalid user git from 104.248.93.62 port 42890
Feb 16 12:37:01 compute-1 sshd-session[31555]: Connection closed by invalid user git 104.248.93.62 port 42890 [preauth]
Feb 16 12:37:43 compute-1 sshd-session[31557]: Invalid user git from 104.248.93.62 port 53376
Feb 16 12:37:43 compute-1 sshd-session[31557]: Connection closed by invalid user git 104.248.93.62 port 53376 [preauth]
Feb 16 12:38:04 compute-1 sshd-session[31559]: Connection closed by authenticating user root 142.93.238.36 port 49754 [preauth]
Feb 16 12:38:19 compute-1 python3[31585]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:38:25 compute-1 sshd-session[31587]: Invalid user git from 104.248.93.62 port 60604
Feb 16 12:38:25 compute-1 sshd-session[31587]: Connection closed by invalid user git 104.248.93.62 port 60604 [preauth]
Feb 16 12:39:07 compute-1 sshd-session[31589]: Invalid user git from 104.248.93.62 port 35064
Feb 16 12:39:07 compute-1 sshd-session[31589]: Connection closed by invalid user git 104.248.93.62 port 35064 [preauth]
Feb 16 12:39:14 compute-1 sshd-session[31591]: Connection closed by authenticating user root 142.93.238.36 port 33960 [preauth]
Feb 16 12:39:50 compute-1 sshd-session[31593]: Invalid user git from 104.248.93.62 port 55358
Feb 16 12:39:50 compute-1 sshd-session[31593]: Connection closed by invalid user git 104.248.93.62 port 55358 [preauth]
Feb 16 12:40:23 compute-1 sshd-session[31596]: Connection closed by authenticating user root 142.93.238.36 port 39902 [preauth]
Feb 16 12:40:33 compute-1 sshd-session[31598]: Invalid user git from 104.248.93.62 port 41728
Feb 16 12:40:33 compute-1 sshd-session[31598]: Connection closed by invalid user git 104.248.93.62 port 41728 [preauth]
Feb 16 12:41:18 compute-1 sshd-session[31600]: Invalid user git from 104.248.93.62 port 52668
Feb 16 12:41:18 compute-1 sshd-session[31600]: Connection closed by invalid user git 104.248.93.62 port 52668 [preauth]
Feb 16 12:41:32 compute-1 sshd-session[31602]: Connection closed by authenticating user root 142.93.238.36 port 43760 [preauth]
Feb 16 12:42:02 compute-1 sshd-session[31604]: Invalid user git from 104.248.93.62 port 56544
Feb 16 12:42:02 compute-1 sshd-session[31604]: Connection closed by invalid user git 104.248.93.62 port 56544 [preauth]
Feb 16 12:42:43 compute-1 sshd-session[31607]: Connection closed by authenticating user root 142.93.238.36 port 36154 [preauth]
Feb 16 12:42:45 compute-1 sshd-session[31609]: Invalid user git from 104.248.93.62 port 59142
Feb 16 12:42:45 compute-1 sshd-session[31609]: Connection closed by invalid user git 104.248.93.62 port 59142 [preauth]
Feb 16 12:43:19 compute-1 sshd-session[30657]: Received disconnect from 38.102.83.173 port 58044:11: disconnected by user
Feb 16 12:43:19 compute-1 sshd-session[30657]: Disconnected from user zuul 38.102.83.173 port 58044
Feb 16 12:43:19 compute-1 sshd-session[30654]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:43:19 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Feb 16 12:43:19 compute-1 systemd[1]: session-8.scope: Consumed 4.055s CPU time.
Feb 16 12:43:19 compute-1 systemd-logind[821]: Session 8 logged out. Waiting for processes to exit.
Feb 16 12:43:19 compute-1 systemd-logind[821]: Removed session 8.
Feb 16 12:43:29 compute-1 sshd-session[31612]: Invalid user git from 104.248.93.62 port 44584
Feb 16 12:43:29 compute-1 sshd-session[31612]: Connection closed by invalid user git 104.248.93.62 port 44584 [preauth]
Feb 16 12:43:52 compute-1 sshd-session[31614]: Connection closed by authenticating user root 142.93.238.36 port 53040 [preauth]
Feb 16 12:44:11 compute-1 sshd-session[31616]: Invalid user git from 104.248.93.62 port 39528
Feb 16 12:44:11 compute-1 sshd-session[31616]: Connection closed by invalid user git 104.248.93.62 port 39528 [preauth]
Feb 16 12:44:53 compute-1 sshd-session[31619]: Invalid user git from 104.248.93.62 port 33072
Feb 16 12:44:53 compute-1 sshd-session[31619]: Connection closed by invalid user git 104.248.93.62 port 33072 [preauth]
Feb 16 12:45:00 compute-1 sshd-session[31621]: Connection closed by authenticating user root 142.93.238.36 port 59416 [preauth]
Feb 16 12:45:36 compute-1 sshd-session[31624]: Invalid user git from 104.248.93.62 port 54946
Feb 16 12:45:36 compute-1 sshd-session[31624]: Connection closed by invalid user git 104.248.93.62 port 54946 [preauth]
Feb 16 12:46:08 compute-1 sshd-session[31626]: Connection closed by authenticating user root 142.93.238.36 port 46206 [preauth]
Feb 16 12:46:19 compute-1 sshd-session[31628]: Invalid user gerrit from 104.248.93.62 port 46204
Feb 16 12:46:20 compute-1 sshd-session[31628]: Connection closed by invalid user gerrit 104.248.93.62 port 46204 [preauth]
Feb 16 12:47:03 compute-1 sshd-session[31630]: Invalid user gerrit from 104.248.93.62 port 42928
Feb 16 12:47:03 compute-1 sshd-session[31630]: Connection closed by invalid user gerrit 104.248.93.62 port 42928 [preauth]
Feb 16 12:47:20 compute-1 sshd-session[31632]: Connection closed by authenticating user root 142.93.238.36 port 49068 [preauth]
Feb 16 12:47:50 compute-1 sshd-session[31634]: Invalid user gerrit from 104.248.93.62 port 57486
Feb 16 12:47:50 compute-1 sshd-session[31634]: Connection closed by invalid user gerrit 104.248.93.62 port 57486 [preauth]
Feb 16 12:48:32 compute-1 sshd-session[31636]: Connection closed by authenticating user root 142.93.238.36 port 39404 [preauth]
Feb 16 12:48:35 compute-1 sshd-session[31638]: Invalid user gerrit from 104.248.93.62 port 44366
Feb 16 12:48:35 compute-1 sshd-session[31638]: Connection closed by invalid user gerrit 104.248.93.62 port 44366 [preauth]
Feb 16 12:49:20 compute-1 sshd-session[31640]: Invalid user gerrit from 104.248.93.62 port 59852
Feb 16 12:49:20 compute-1 sshd-session[31640]: Connection closed by invalid user gerrit 104.248.93.62 port 59852 [preauth]
Feb 16 12:49:47 compute-1 sshd-session[31642]: Connection closed by authenticating user root 142.93.238.36 port 50248 [preauth]
Feb 16 12:50:05 compute-1 sshd-session[31644]: Invalid user gerrit from 104.248.93.62 port 44288
Feb 16 12:50:05 compute-1 sshd-session[31644]: Connection closed by invalid user gerrit 104.248.93.62 port 44288 [preauth]
Feb 16 12:50:48 compute-1 sshd-session[31648]: Invalid user gerrit from 104.248.93.62 port 58916
Feb 16 12:50:48 compute-1 sshd-session[31648]: Connection closed by invalid user gerrit 104.248.93.62 port 58916 [preauth]
Feb 16 12:50:59 compute-1 sshd-session[31650]: Connection closed by authenticating user root 142.93.238.36 port 48092 [preauth]
Feb 16 12:51:58 compute-1 sshd-session[31653]: Connection closed by authenticating user root 142.93.238.36 port 47990 [preauth]
Feb 16 12:52:56 compute-1 sshd-session[31655]: Connection closed by authenticating user root 142.93.238.36 port 56198 [preauth]
Feb 16 12:53:49 compute-1 sshd-session[31657]: Accepted publickey for zuul from 192.168.122.30 port 52316 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:53:49 compute-1 systemd-logind[821]: New session 9 of user zuul.
Feb 16 12:53:49 compute-1 systemd[1]: Started Session 9 of User zuul.
Feb 16 12:53:49 compute-1 sshd-session[31657]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:53:50 compute-1 python3.9[31810]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:53:51 compute-1 sudo[31989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtnjqemfwqeyehsdnheewlmnrodjolxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246431.2964368-44-8864926827788/AnsiballZ_command.py'
Feb 16 12:53:51 compute-1 sudo[31989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:53:51 compute-1 python3.9[31991]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:53:55 compute-1 sshd-session[32009]: Connection closed by authenticating user root 142.93.238.36 port 52714 [preauth]
Feb 16 12:53:59 compute-1 sudo[31989]: pam_unix(sudo:session): session closed for user root
Feb 16 12:53:59 compute-1 sshd-session[31660]: Connection closed by 192.168.122.30 port 52316
Feb 16 12:53:59 compute-1 sshd-session[31657]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:53:59 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Feb 16 12:53:59 compute-1 systemd[1]: session-9.scope: Consumed 7.351s CPU time.
Feb 16 12:53:59 compute-1 systemd-logind[821]: Session 9 logged out. Waiting for processes to exit.
Feb 16 12:53:59 compute-1 systemd-logind[821]: Removed session 9.
Feb 16 12:54:05 compute-1 sshd-session[32051]: Accepted publickey for zuul from 192.168.122.30 port 44970 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:54:05 compute-1 systemd-logind[821]: New session 10 of user zuul.
Feb 16 12:54:05 compute-1 systemd[1]: Started Session 10 of User zuul.
Feb 16 12:54:05 compute-1 sshd-session[32051]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:54:07 compute-1 python3.9[32204]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:07 compute-1 sshd-session[32054]: Connection closed by 192.168.122.30 port 44970
Feb 16 12:54:07 compute-1 sshd-session[32051]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:54:07 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Feb 16 12:54:07 compute-1 systemd-logind[821]: Session 10 logged out. Waiting for processes to exit.
Feb 16 12:54:07 compute-1 systemd-logind[821]: Removed session 10.
Feb 16 12:54:23 compute-1 sshd-session[32232]: Accepted publickey for zuul from 192.168.122.30 port 37072 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:54:23 compute-1 systemd-logind[821]: New session 11 of user zuul.
Feb 16 12:54:23 compute-1 systemd[1]: Started Session 11 of User zuul.
Feb 16 12:54:23 compute-1 sshd-session[32232]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:54:25 compute-1 python3.9[32385]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 16 12:54:26 compute-1 python3.9[32559]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:27 compute-1 sudo[32709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifkkfcbqdauymukbzgsvhxfyrbxwmpdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246466.68639-70-123813005657042/AnsiballZ_command.py'
Feb 16 12:54:27 compute-1 sudo[32709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:27 compute-1 python3.9[32711]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:54:27 compute-1 sudo[32709]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:28 compute-1 sudo[32862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejqybdijhzdwzjlkiyifcubrkqztejrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246467.607952-94-137919315888966/AnsiballZ_stat.py'
Feb 16 12:54:28 compute-1 sudo[32862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:28 compute-1 python3.9[32864]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:54:28 compute-1 sudo[32862]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:29 compute-1 sudo[33014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bejqxzmsgcklqihsejeimhgoztjylaow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246468.7149985-110-247329233755181/AnsiballZ_file.py'
Feb 16 12:54:29 compute-1 sudo[33014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:29 compute-1 python3.9[33016]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:29 compute-1 sudo[33014]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:29 compute-1 sudo[33166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fytgqqitpvvcjioyfaoolxdetievmtgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246469.6170797-126-83000900799520/AnsiballZ_stat.py'
Feb 16 12:54:29 compute-1 sudo[33166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:30 compute-1 python3.9[33168]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:54:30 compute-1 sudo[33166]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:30 compute-1 sudo[33289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycmyzgdtzxkltenubavwwgrryhzlrwwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246469.6170797-126-83000900799520/AnsiballZ_copy.py'
Feb 16 12:54:30 compute-1 sudo[33289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:30 compute-1 python3.9[33291]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246469.6170797-126-83000900799520/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:30 compute-1 sudo[33289]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:31 compute-1 sudo[33441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhnpvppcjoqixwbszfwiahcplrpepcrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246470.999659-156-134990413875783/AnsiballZ_setup.py'
Feb 16 12:54:31 compute-1 sudo[33441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:31 compute-1 python3.9[33443]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:31 compute-1 sudo[33441]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:32 compute-1 sudo[33597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmrzcyltpybzxvdzlxmketkeuxiitvco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246471.971752-172-224562751459850/AnsiballZ_file.py'
Feb 16 12:54:32 compute-1 sudo[33597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:32 compute-1 python3.9[33599]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:54:32 compute-1 sudo[33597]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:33 compute-1 sudo[33749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqjsbnfowspdqyoestzbiekozkxejdlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246472.9723642-190-272813514592935/AnsiballZ_file.py'
Feb 16 12:54:33 compute-1 sudo[33749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:33 compute-1 python3.9[33751]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:54:33 compute-1 sudo[33749]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:34 compute-1 python3.9[33901]: ansible-ansible.builtin.service_facts Invoked
Feb 16 12:54:38 compute-1 python3.9[34155]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:54:38 compute-1 python3.9[34305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:40 compute-1 python3.9[34459]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:54:41 compute-1 sudo[34615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aotfaaajgnzqoegldivvrxdnxcltkaqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246480.9414442-286-193163247099080/AnsiballZ_setup.py'
Feb 16 12:54:41 compute-1 sudo[34615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:41 compute-1 python3.9[34617]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:54:41 compute-1 sudo[34615]: pam_unix(sudo:session): session closed for user root
Feb 16 12:54:42 compute-1 sudo[34699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofetuuqoefqohccmvjyqaooedjoyuxkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246480.9414442-286-193163247099080/AnsiballZ_dnf.py'
Feb 16 12:54:42 compute-1 sudo[34699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:54:42 compute-1 python3.9[34701]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:54:55 compute-1 sshd-session[34775]: Connection closed by authenticating user root 142.93.238.36 port 42436 [preauth]
Feb 16 12:55:31 compute-1 systemd[1]: Reloading.
Feb 16 12:55:31 compute-1 systemd-rc-local-generator[34898]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:31 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 16 12:55:31 compute-1 systemd[1]: Reloading.
Feb 16 12:55:32 compute-1 systemd-rc-local-generator[34949]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:32 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 16 12:55:32 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 16 12:55:32 compute-1 systemd[1]: Reloading.
Feb 16 12:55:32 compute-1 systemd-rc-local-generator[35000]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:55:32 compute-1 systemd[1]: Starting dnf makecache...
Feb 16 12:55:32 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 16 12:55:32 compute-1 dnf[35015]: Failed determining last makecache time.
Feb 16 12:55:32 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 12:55:32 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 12:55:32 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-barbican-42b4c41831408a8e323 129 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-python-glean-642fffe0203a8ffcc2443db52 153 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-cinder-1c00d6490d88e436f26ef 150 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-python-stevedore-c4acc5639fd2329372142 155 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-python-cloudkitty-tests-tempest-783703 148 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-diskimage-builder-61b717cc45660834fe9a 148 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-nova-eaa65f0b85123a4ee343246 148 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-python-designate-tests-tempest-347fdbc 155 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-glance-1fd12c29b339f30fe823e 190 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 172 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-manila-d783d10e75495b73866db 157 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-neutron-95cadbd379667c8520c8 184 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-octavia-5975097dd4b021385178 175 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-watcher-c014f81a8647287f6dcc 171 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-python-tcib-78032d201b02cee27e8e644c61 163 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 187 kB/s | 3.0 kB     00:00
Feb 16 12:55:32 compute-1 dnf[35015]: delorean-openstack-swift-dc98a8463506ac520c469a 177 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: delorean-python-tempestconf-8515371b7cceebd4282 209 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: delorean-openstack-heat-ui-013accbfd179753bc3f0 197 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: CentOS Stream 9 - BaseOS                         73 kB/s | 7.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: CentOS Stream 9 - AppStream                      72 kB/s | 7.1 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: CentOS Stream 9 - CRB                            69 kB/s | 6.9 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: CentOS Stream 9 - Extras packages                70 kB/s | 7.6 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: dlrn-antelope-testing                           139 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: dlrn-antelope-build-deps                        145 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: centos9-rabbitmq                                 53 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: centos9-storage                                  40 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: centos9-opstools                                 45 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: NFV SIG OpenvSwitch                              78 kB/s | 3.0 kB     00:00
Feb 16 12:55:33 compute-1 dnf[35015]: repo-setup-centos-appstream                     149 kB/s | 4.4 kB     00:00
Feb 16 12:55:34 compute-1 dnf[35015]: repo-setup-centos-baseos                        168 kB/s | 3.9 kB     00:00
Feb 16 12:55:34 compute-1 dnf[35015]: repo-setup-centos-highavailability              176 kB/s | 3.9 kB     00:00
Feb 16 12:55:34 compute-1 dnf[35015]: repo-setup-centos-powertools                    177 kB/s | 4.3 kB     00:00
Feb 16 12:55:34 compute-1 dnf[35015]: Extra Packages for Enterprise Linux 9 - x86_64  230 kB/s |  29 kB     00:00
Feb 16 12:55:35 compute-1 dnf[35015]: Metadata cache created.
Feb 16 12:55:35 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 16 12:55:35 compute-1 systemd[1]: Finished dnf makecache.
Feb 16 12:55:35 compute-1 systemd[1]: dnf-makecache.service: Consumed 1.872s CPU time.
Feb 16 12:55:53 compute-1 sshd-session[35145]: Connection closed by authenticating user root 142.93.238.36 port 34920 [preauth]
Feb 16 12:56:31 compute-1 kernel: SELinux:  Converting 2727 SID table entries...
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 12:56:31 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 12:56:31 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 16 12:56:31 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:56:31 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:56:31 compute-1 systemd[1]: Reloading.
Feb 16 12:56:31 compute-1 systemd-rc-local-generator[35378]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:56:32 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:56:32 compute-1 sudo[34699]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:32 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:56:32 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:56:32 compute-1 systemd[1]: run-r8bd9ed85b99240de80d860443c16482e.service: Deactivated successfully.
Feb 16 12:56:32 compute-1 sudo[36297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fufxerhbdhekhkxikwnugbkxguuikcce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246592.6334713-310-197320589331933/AnsiballZ_command.py'
Feb 16 12:56:32 compute-1 sudo[36297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:33 compute-1 python3.9[36299]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:56:34 compute-1 sudo[36297]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:34 compute-1 sudo[36578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okculpcoglvtvtpllclcdlsyofbjettu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246594.2175183-327-129889079813411/AnsiballZ_selinux.py'
Feb 16 12:56:34 compute-1 sudo[36578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:35 compute-1 python3.9[36580]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 16 12:56:35 compute-1 sudo[36578]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:35 compute-1 sudo[36730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqubxpjzbjeszunyihbzslgirjvilfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246595.51969-348-154229690721353/AnsiballZ_command.py'
Feb 16 12:56:35 compute-1 sudo[36730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:35 compute-1 python3.9[36732]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 16 12:56:37 compute-1 sudo[36730]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:37 compute-1 sudo[36883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpjvfxkabthgwiqaclvsxilkkaotepuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246597.5425024-364-141196380794342/AnsiballZ_file.py'
Feb 16 12:56:37 compute-1 sudo[36883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:38 compute-1 python3.9[36885]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:38 compute-1 sudo[36883]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:40 compute-1 sudo[37035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdycfponnclcwicozttqcjbrhcwuwabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246600.1171095-382-13437337239867/AnsiballZ_mount.py'
Feb 16 12:56:40 compute-1 sudo[37035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:40 compute-1 python3.9[37037]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 16 12:56:40 compute-1 sudo[37035]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:41 compute-1 sudo[37187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofndpffvyzxuvusncazvaojnakatlndn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246601.6371336-436-44898857731128/AnsiballZ_file.py'
Feb 16 12:56:41 compute-1 sudo[37187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:42 compute-1 python3.9[37189]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:42 compute-1 sudo[37187]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:42 compute-1 sudo[37339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrpktskibiuxxdihdplangdyymmvbrqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246602.3208735-452-54080605257438/AnsiballZ_stat.py'
Feb 16 12:56:42 compute-1 sudo[37339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:42 compute-1 python3.9[37341]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:56:42 compute-1 sudo[37339]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:43 compute-1 sudo[37462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hitcmqoowaxkyluygbvtnsqqhgrnpduw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246602.3208735-452-54080605257438/AnsiballZ_copy.py'
Feb 16 12:56:43 compute-1 sudo[37462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:45 compute-1 python3.9[37464]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246602.3208735-452-54080605257438/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:45 compute-1 sudo[37462]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:46 compute-1 sudo[37614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhkuarkgzhhyjcosfeiczvkiuypfhxbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246606.6255732-500-249478931201409/AnsiballZ_stat.py'
Feb 16 12:56:46 compute-1 sudo[37614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:47 compute-1 python3.9[37616]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:56:47 compute-1 sudo[37614]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:47 compute-1 sudo[37766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhfhrqnmaydavohmkpidyiqqjhegnqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246607.2676651-516-135971253048716/AnsiballZ_command.py'
Feb 16 12:56:47 compute-1 sudo[37766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:47 compute-1 python3.9[37768]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:56:47 compute-1 sudo[37766]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:48 compute-1 sudo[37919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pujhwjyjvesenroyjtcxrjmkjyjobyud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246607.9889138-532-169127785056199/AnsiballZ_file.py'
Feb 16 12:56:48 compute-1 sudo[37919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:48 compute-1 python3.9[37921]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:56:48 compute-1 sudo[37919]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:49 compute-1 sudo[38071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzryxbzpholrtugrgsslryyuoivguso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246608.911972-554-227521599652292/AnsiballZ_getent.py'
Feb 16 12:56:49 compute-1 sudo[38071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:49 compute-1 python3.9[38073]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 16 12:56:49 compute-1 sudo[38071]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:49 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 12:56:50 compute-1 sudo[38227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmgayljnzxnqscpgmeixyypgmltmdrmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246609.673075-570-209664798249692/AnsiballZ_group.py'
Feb 16 12:56:50 compute-1 sudo[38227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:50 compute-1 python3.9[38229]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:56:50 compute-1 groupadd[38230]: group added to /etc/group: name=qemu, GID=107
Feb 16 12:56:50 compute-1 groupadd[38230]: group added to /etc/gshadow: name=qemu
Feb 16 12:56:50 compute-1 groupadd[38230]: new group: name=qemu, GID=107
Feb 16 12:56:50 compute-1 sudo[38227]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:50 compute-1 sshd-session[38152]: Connection closed by authenticating user root 142.93.238.36 port 46062 [preauth]
Feb 16 12:56:50 compute-1 sudo[38385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyafddjgvvrxceculdwczwmbnigmfwkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246610.533943-586-210524396763170/AnsiballZ_user.py'
Feb 16 12:56:50 compute-1 sudo[38385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:51 compute-1 python3.9[38387]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 12:56:51 compute-1 useradd[38389]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 12:56:51 compute-1 sudo[38385]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:51 compute-1 sudo[38545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpftxhuufhvwbibybfsndxuigrshxeig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246611.4121265-602-49263765846208/AnsiballZ_getent.py'
Feb 16 12:56:51 compute-1 sudo[38545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:51 compute-1 python3.9[38547]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 16 12:56:51 compute-1 sudo[38545]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:52 compute-1 sudo[38698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqciyozmvmioqxtdzuerqzgyssoemkgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246612.0324717-618-254651638930011/AnsiballZ_group.py'
Feb 16 12:56:52 compute-1 sudo[38698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:52 compute-1 python3.9[38700]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:56:52 compute-1 groupadd[38701]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 16 12:56:52 compute-1 groupadd[38701]: group added to /etc/gshadow: name=hugetlbfs
Feb 16 12:56:52 compute-1 groupadd[38701]: new group: name=hugetlbfs, GID=42477
Feb 16 12:56:52 compute-1 sudo[38698]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:52 compute-1 sudo[38856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxcihbvwspxqycexcemvxonodvthueau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246612.7799876-636-59764238868715/AnsiballZ_file.py'
Feb 16 12:56:52 compute-1 sudo[38856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:53 compute-1 python3.9[38858]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 16 12:56:53 compute-1 sudo[38856]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:53 compute-1 sudo[39008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slusexwecwwcwjxmqsohdyztuxbrckdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246613.6460779-658-124070521019350/AnsiballZ_dnf.py'
Feb 16 12:56:53 compute-1 sudo[39008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:54 compute-1 python3.9[39010]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:56:56 compute-1 sudo[39008]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:56 compute-1 sudo[39161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygnucrglscixkrdfuyllpyrlgcbnxvtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246616.36086-674-4797402226021/AnsiballZ_file.py'
Feb 16 12:56:56 compute-1 sudo[39161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:56 compute-1 python3.9[39163]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:56 compute-1 sudo[39161]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:57 compute-1 sudo[39313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfwzairauoggekocerhglwzxmwdgwsrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246617.181563-690-273012383771752/AnsiballZ_stat.py'
Feb 16 12:56:57 compute-1 sudo[39313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:57 compute-1 python3.9[39315]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:56:57 compute-1 sudo[39313]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:57 compute-1 sudo[39436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjrqcfnicnawlboactjwiccazuxxygnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246617.181563-690-273012383771752/AnsiballZ_copy.py'
Feb 16 12:56:57 compute-1 sudo[39436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:57 compute-1 python3.9[39438]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246617.181563-690-273012383771752/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:56:58 compute-1 sudo[39436]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:58 compute-1 sudo[39588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aijemndbjdgwrxuvaohvidvcfhuupwxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246618.3575432-720-131808055516964/AnsiballZ_systemd.py'
Feb 16 12:56:58 compute-1 sudo[39588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:56:59 compute-1 python3.9[39590]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:56:59 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 16 12:56:59 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 16 12:56:59 compute-1 kernel: Bridge firewalling registered
Feb 16 12:56:59 compute-1 systemd-modules-load[39594]: Inserted module 'br_netfilter'
Feb 16 12:56:59 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 16 12:56:59 compute-1 sudo[39588]: pam_unix(sudo:session): session closed for user root
Feb 16 12:56:59 compute-1 sudo[39747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmwfdejoedstficokaxgtwvtkmpbvyeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246619.631342-736-102101220832482/AnsiballZ_stat.py'
Feb 16 12:56:59 compute-1 sudo[39747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:00 compute-1 python3.9[39749]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:00 compute-1 sudo[39747]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:00 compute-1 sudo[39870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blitzqvwblhumwncsexmypcwfzsbbqck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246619.631342-736-102101220832482/AnsiballZ_copy.py'
Feb 16 12:57:00 compute-1 sudo[39870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:00 compute-1 python3.9[39872]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246619.631342-736-102101220832482/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:00 compute-1 sudo[39870]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:01 compute-1 sudo[40022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwcleedqvjphuahbdfnhhgmjxucqvxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246620.9870677-772-170816259168927/AnsiballZ_dnf.py'
Feb 16 12:57:01 compute-1 sudo[40022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:01 compute-1 python3.9[40024]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:57:08 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 12:57:08 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 12:57:09 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 12:57:09 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 12:57:09 compute-1 systemd[1]: Reloading.
Feb 16 12:57:09 compute-1 systemd-rc-local-generator[40086]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:09 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 12:57:09 compute-1 sudo[40022]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:10 compute-1 python3.9[41567]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:57:11 compute-1 python3.9[42647]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 16 12:57:12 compute-1 python3.9[43444]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 12:57:12 compute-1 sudo[44262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvjoahejlqabuwauzjfxabisoljpbava ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246632.4267943-850-258271350965545/AnsiballZ_command.py'
Feb 16 12:57:12 compute-1 sudo[44262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:12 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 12:57:12 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 12:57:12 compute-1 systemd[1]: man-db-cache-update.service: Consumed 4.421s CPU time.
Feb 16 12:57:12 compute-1 systemd[1]: run-r0090dcb4c4e6462f856a0aa22779534e.service: Deactivated successfully.
Feb 16 12:57:12 compute-1 python3.9[44264]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:12 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 12:57:13 compute-1 systemd[1]: Starting Authorization Manager...
Feb 16 12:57:13 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 12:57:13 compute-1 polkitd[44483]: Started polkitd version 0.117
Feb 16 12:57:13 compute-1 polkitd[44483]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 12:57:13 compute-1 polkitd[44483]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 12:57:13 compute-1 polkitd[44483]: Finished loading, compiling and executing 2 rules
Feb 16 12:57:13 compute-1 systemd[1]: Started Authorization Manager.
Feb 16 12:57:13 compute-1 polkitd[44483]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 16 12:57:13 compute-1 sudo[44262]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:14 compute-1 sudo[44651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xehijmmsxvfvxeiegjylruexsfzlsoqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246634.0007722-868-267327339870471/AnsiballZ_systemd.py'
Feb 16 12:57:14 compute-1 sudo[44651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:14 compute-1 python3.9[44653]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:14 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 16 12:57:14 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Feb 16 12:57:14 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 16 12:57:14 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 12:57:14 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 12:57:14 compute-1 sudo[44651]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:15 compute-1 python3.9[44814]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 16 12:57:18 compute-1 sudo[44964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uacrhracxmtfbzcvfltpklshlazfitcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246638.4979093-982-169487191826159/AnsiballZ_systemd.py'
Feb 16 12:57:18 compute-1 sudo[44964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:19 compute-1 python3.9[44966]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:19 compute-1 systemd[1]: Reloading.
Feb 16 12:57:20 compute-1 systemd-rc-local-generator[44988]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:20 compute-1 sudo[44964]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:20 compute-1 sudo[45161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezwgsypzhjmshvcpgxcrdjfjgwifpjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246640.2929244-982-3967343218263/AnsiballZ_systemd.py'
Feb 16 12:57:20 compute-1 sudo[45161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:20 compute-1 python3.9[45163]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 12:57:20 compute-1 systemd[1]: Reloading.
Feb 16 12:57:20 compute-1 systemd-rc-local-generator[45193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 12:57:21 compute-1 sudo[45161]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:21 compute-1 sudo[45357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pugkpwldjvqvhcqaowcgiynsrddwttvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246641.3547175-1014-47791131258517/AnsiballZ_command.py'
Feb 16 12:57:21 compute-1 sudo[45357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:21 compute-1 python3.9[45359]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:21 compute-1 sudo[45357]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:22 compute-1 sudo[45510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmuhunjrvtqappntucunoklsinajcqbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246642.0502415-1030-3981588953941/AnsiballZ_command.py'
Feb 16 12:57:22 compute-1 sudo[45510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:22 compute-1 python3.9[45512]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:22 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 16 12:57:22 compute-1 sudo[45510]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:22 compute-1 sudo[45663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfkrismvndzmgdqhwcrgmsvzeuczixyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246642.7148833-1046-243786505925307/AnsiballZ_command.py'
Feb 16 12:57:22 compute-1 sudo[45663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:23 compute-1 python3.9[45665]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:24 compute-1 sudo[45663]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:25 compute-1 sudo[45825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqrrypmmvzrernsparrorbariuovsafb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246644.8918643-1062-185655986578416/AnsiballZ_command.py'
Feb 16 12:57:25 compute-1 sudo[45825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:25 compute-1 python3.9[45827]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:25 compute-1 sudo[45825]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:25 compute-1 sudo[45978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iufasnhgbgtdoapbalzflukdaklrknqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246645.5265648-1078-244127098807358/AnsiballZ_systemd.py'
Feb 16 12:57:25 compute-1 sudo[45978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:26 compute-1 python3.9[45980]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 12:57:26 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 12:57:26 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Feb 16 12:57:26 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Feb 16 12:57:26 compute-1 systemd[1]: Starting Apply Kernel Variables...
Feb 16 12:57:26 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 16 12:57:26 compute-1 systemd[1]: Finished Apply Kernel Variables.
Feb 16 12:57:26 compute-1 sudo[45978]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:26 compute-1 sshd-session[32235]: Connection closed by 192.168.122.30 port 37072
Feb 16 12:57:26 compute-1 sshd-session[32232]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:57:26 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Feb 16 12:57:26 compute-1 systemd[1]: session-11.scope: Consumed 2min 12.273s CPU time.
Feb 16 12:57:26 compute-1 systemd-logind[821]: Session 11 logged out. Waiting for processes to exit.
Feb 16 12:57:26 compute-1 systemd-logind[821]: Removed session 11.
Feb 16 12:57:32 compute-1 sshd-session[46010]: Accepted publickey for zuul from 192.168.122.30 port 44070 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:57:32 compute-1 systemd-logind[821]: New session 12 of user zuul.
Feb 16 12:57:32 compute-1 systemd[1]: Started Session 12 of User zuul.
Feb 16 12:57:32 compute-1 sshd-session[46010]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:57:33 compute-1 python3.9[46163]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:34 compute-1 python3.9[46317]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:36 compute-1 sudo[46471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icposummqcohykhpjcgdelviilnaykbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246655.5736978-81-141889507702702/AnsiballZ_command.py'
Feb 16 12:57:36 compute-1 sudo[46471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:36 compute-1 python3.9[46473]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:36 compute-1 sudo[46471]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:37 compute-1 python3.9[46624]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:37 compute-1 sudo[46778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxcukpjamqrjtvxvcwwxumeiunspykzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246657.7277548-121-24098915039039/AnsiballZ_setup.py'
Feb 16 12:57:37 compute-1 sudo[46778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:38 compute-1 python3.9[46780]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:57:38 compute-1 sudo[46778]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:38 compute-1 sudo[46862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycykbjijkyosuqorwdzpdjseohqbclvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246657.7277548-121-24098915039039/AnsiballZ_dnf.py'
Feb 16 12:57:38 compute-1 sudo[46862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:39 compute-1 python3.9[46864]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 12:57:40 compute-1 sudo[46862]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:41 compute-1 sudo[47015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upsotpecgsscxjgbcaxigeatcfogvnxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246660.9795783-145-208069889066403/AnsiballZ_setup.py'
Feb 16 12:57:41 compute-1 sudo[47015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:41 compute-1 python3.9[47017]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:57:41 compute-1 sudo[47015]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:42 compute-1 sudo[47186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcckcpsrvbbkkwuwkzzcfjvujngizcei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246661.9289093-167-136914221086663/AnsiballZ_file.py'
Feb 16 12:57:42 compute-1 sudo[47186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:42 compute-1 python3.9[47188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:57:42 compute-1 sudo[47186]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:43 compute-1 sudo[47338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrarojoornkkbytlkfcciacouqvhqmwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246662.754219-183-251836594488726/AnsiballZ_command.py'
Feb 16 12:57:43 compute-1 sudo[47338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:43 compute-1 python3.9[47340]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 12:57:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2954874597-merged.mount: Deactivated successfully.
Feb 16 12:57:43 compute-1 podman[47341]: 2026-02-16 12:57:43.294194278 +0000 UTC m=+0.064222291 system refresh
Feb 16 12:57:43 compute-1 sudo[47338]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:43 compute-1 sudo[47502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htjtknpttulhvlcgjcwmyowglhbpsbbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246663.5147276-199-84521623905405/AnsiballZ_stat.py'
Feb 16 12:57:43 compute-1 sudo[47502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:44 compute-1 python3.9[47504]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:44 compute-1 sudo[47502]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:44 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:57:44 compute-1 sudo[47625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwqlhojhicosucmgzrhxwdvnqmmetuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246663.5147276-199-84521623905405/AnsiballZ_copy.py'
Feb 16 12:57:44 compute-1 sudo[47625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:44 compute-1 python3.9[47627]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246663.5147276-199-84521623905405/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a100366246609640c5e9834fdaecac85f6cf6992 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:57:45 compute-1 sudo[47625]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:45 compute-1 sudo[47777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jctjsdtkmmgkknvywnnqyzfemwrpnvog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246665.2327127-229-262327603610681/AnsiballZ_stat.py'
Feb 16 12:57:45 compute-1 sudo[47777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:45 compute-1 python3.9[47779]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:57:45 compute-1 sudo[47777]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:46 compute-1 sudo[47900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrgdblahtrmpmkurwfdgbifulshbofpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246665.2327127-229-262327603610681/AnsiballZ_copy.py'
Feb 16 12:57:46 compute-1 sudo[47900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:46 compute-1 python3.9[47902]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246665.2327127-229-262327603610681/.source.conf follow=False _original_basename=registries.conf.j2 checksum=9b3bcfdba57b23b453cfef4c881e370a3c5d3bf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:46 compute-1 sudo[47900]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:46 compute-1 sudo[48054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnvbcxfzzvglqignhqtykrnfnusfyybw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246666.5174184-261-218372341589352/AnsiballZ_ini_file.py'
Feb 16 12:57:46 compute-1 sudo[48054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:47 compute-1 python3.9[48056]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:47 compute-1 sudo[48054]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:47 compute-1 sshd-session[47979]: Connection closed by authenticating user root 142.93.238.36 port 52632 [preauth]
Feb 16 12:57:47 compute-1 sudo[48206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taeecyhjrejphesdkrvxcdpnxpiuydyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246667.22528-261-180484793111939/AnsiballZ_ini_file.py'
Feb 16 12:57:47 compute-1 sudo[48206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:47 compute-1 python3.9[48208]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:47 compute-1 sudo[48206]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:47 compute-1 sudo[48358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebawfmsuvvprbfvproriloiouahlbwql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246667.7529502-261-118400539433877/AnsiballZ_ini_file.py'
Feb 16 12:57:47 compute-1 sudo[48358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:48 compute-1 python3.9[48360]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:48 compute-1 sudo[48358]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:48 compute-1 sudo[48510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgevjjnqqvgpuogrsebefhdvsounzbpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246668.2528074-261-213281742799115/AnsiballZ_ini_file.py'
Feb 16 12:57:48 compute-1 sudo[48510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:48 compute-1 python3.9[48512]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 12:57:48 compute-1 sudo[48510]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:49 compute-1 python3.9[48662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:57:50 compute-1 sudo[48814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fngpmqksdcwvqqihaxutpssgdwmrcezm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246670.2252865-341-10426711067694/AnsiballZ_dnf.py'
Feb 16 12:57:50 compute-1 sudo[48814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:50 compute-1 python3.9[48816]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:52 compute-1 sudo[48814]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:52 compute-1 sudo[48967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gerdxurkmxyfebzuzflgzkbqrdfibuvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246672.3359926-357-112687028838056/AnsiballZ_dnf.py'
Feb 16 12:57:52 compute-1 sudo[48967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:52 compute-1 python3.9[48969]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:55 compute-1 sudo[48967]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:55 compute-1 sudo[49127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdloupdeyddrnnfwrooybycfpgjpscen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246675.5526984-377-171741972403761/AnsiballZ_dnf.py'
Feb 16 12:57:55 compute-1 sudo[49127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:56 compute-1 python3.9[49129]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:57 compute-1 sudo[49127]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:57 compute-1 sudo[49280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojyltkdvsxsmjxfgpixbtthlirhfkzbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246677.624141-395-74721553837869/AnsiballZ_dnf.py'
Feb 16 12:57:57 compute-1 sudo[49280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:57:58 compute-1 python3.9[49282]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:57:59 compute-1 sudo[49280]: pam_unix(sudo:session): session closed for user root
Feb 16 12:57:59 compute-1 sudo[49433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyucyygwpwezspkoeiviubudhhbiudft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246679.764313-417-186426254118004/AnsiballZ_dnf.py'
Feb 16 12:57:59 compute-1 sudo[49433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:00 compute-1 python3.9[49435]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:01 compute-1 sudo[49433]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:02 compute-1 sudo[49589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdmbbpwwbnwedepxfrrmadcizmdmzhoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246682.113525-433-224705631406652/AnsiballZ_dnf.py'
Feb 16 12:58:02 compute-1 sudo[49589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:02 compute-1 python3.9[49591]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:13 compute-1 sudo[49589]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:13 compute-1 sudo[49759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krhknazbhcjhfqgfmrtxlhlawevkmbga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246693.3000538-451-277971713982988/AnsiballZ_dnf.py'
Feb 16 12:58:13 compute-1 sudo[49759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:13 compute-1 python3.9[49761]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:15 compute-1 sudo[49759]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:15 compute-1 sudo[49912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjkhurjgkopliqqfdqpuwlseokiiambi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246695.357674-469-124430923754798/AnsiballZ_dnf.py'
Feb 16 12:58:15 compute-1 sudo[49912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:15 compute-1 python3.9[49914]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:42 compute-1 sudo[49912]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:44 compute-1 sudo[50248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfimotpaqnhpuvmufermcxjejamwsvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246724.4896061-487-5274962472546/AnsiballZ_dnf.py'
Feb 16 12:58:44 compute-1 sudo[50248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:44 compute-1 python3.9[50250]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:46 compute-1 sudo[50248]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:46 compute-1 sudo[50404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwcppwpuxnazvzciojsqlkgzcnmzvcqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246726.5754626-507-236658598761226/AnsiballZ_dnf.py'
Feb 16 12:58:46 compute-1 sudo[50404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:47 compute-1 python3.9[50406]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:58:47 compute-1 sshd-session[50407]: Connection closed by authenticating user root 142.93.238.36 port 42334 [preauth]
Feb 16 12:58:48 compute-1 sudo[50404]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:49 compute-1 sudo[50563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlodrynzlxjkskxphfadkvhjorggyxrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.3651178-529-118342792046155/AnsiballZ_file.py'
Feb 16 12:58:49 compute-1 sudo[50563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:49 compute-1 python3.9[50565]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:58:49 compute-1 sudo[50563]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:50 compute-1 sudo[50738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oolzthgdzsjjytobxpfohflsxbcvzkza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.990656-545-152779163393555/AnsiballZ_stat.py'
Feb 16 12:58:50 compute-1 sudo[50738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:50 compute-1 python3.9[50740]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 12:58:50 compute-1 sudo[50738]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:50 compute-1 sudo[50861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbmsolrwelmvmcgpcnmbdqvvzlyeyych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246729.990656-545-152779163393555/AnsiballZ_copy.py'
Feb 16 12:58:50 compute-1 sudo[50861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:50 compute-1 python3.9[50863]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771246729.990656-545-152779163393555/.source.json _original_basename=.4dnvnhj0 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 12:58:50 compute-1 sudo[50861]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:51 compute-1 sudo[51013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olqrmdluyggvcnxkcztibzywxahsfwzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246731.3637886-581-1240769975473/AnsiballZ_podman_image.py'
Feb 16 12:58:51 compute-1 sudo[51013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:52 compute-1 python3.9[51015]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:58:52 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat715578937-lower\x2dmapped.mount: Deactivated successfully.
Feb 16 12:58:57 compute-1 podman[51028]: 2026-02-16 12:58:57.52882948 +0000 UTC m=+5.400361070 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 12:58:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:57 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:58:57 compute-1 sudo[51013]: pam_unix(sudo:session): session closed for user root
Feb 16 12:58:58 compute-1 sudo[51324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxhtjzgivuwrwumujabfpqjflqmqygl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246738.1095314-603-143984459029188/AnsiballZ_podman_image.py'
Feb 16 12:58:58 compute-1 sudo[51324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:58:58 compute-1 python3.9[51326]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:58:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:07 compute-1 podman[51339]: 2026-02-16 12:59:07.442568191 +0000 UTC m=+8.872396083 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 12:59:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:07 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:08 compute-1 sudo[51324]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:09 compute-1 sudo[51636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsgtqasgskyjpjknmxwklznmitsxncrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246748.943166-623-33718965054728/AnsiballZ_podman_image.py'
Feb 16 12:59:09 compute-1 sudo[51636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:09 compute-1 python3.9[51638]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:09 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:23 compute-1 podman[51650]: 2026-02-16 12:59:23.704568618 +0000 UTC m=+14.256819455 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 12:59:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:23 compute-1 sudo[51636]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:24 compute-1 sudo[51920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzbxvdrdungdqiwmlpdiuviggwysdgvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246764.4505181-645-234156818053612/AnsiballZ_podman_image.py'
Feb 16 12:59:24 compute-1 sudo[51920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:24 compute-1 python3.9[51922]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:25 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:30 compute-1 podman[51934]: 2026-02-16 12:59:30.46574214 +0000 UTC m=+5.443003756 image pull be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 16 12:59:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:30 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:30 compute-1 sudo[51920]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:31 compute-1 sudo[52198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruchetjrnpnlejazdmicnmaoynoojrcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246770.8052301-645-143640265679973/AnsiballZ_podman_image.py'
Feb 16 12:59:31 compute-1 sudo[52198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:31 compute-1 python3.9[52200]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 12:59:32 compute-1 podman[52212]: 2026-02-16 12:59:32.395065736 +0000 UTC m=+1.036453694 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 16 12:59:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:32 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 12:59:32 compute-1 sudo[52198]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:33 compute-1 sshd-session[46013]: Connection closed by 192.168.122.30 port 44070
Feb 16 12:59:33 compute-1 sshd-session[46010]: pam_unix(sshd:session): session closed for user zuul
Feb 16 12:59:33 compute-1 systemd-logind[821]: Session 12 logged out. Waiting for processes to exit.
Feb 16 12:59:33 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Feb 16 12:59:33 compute-1 systemd[1]: session-12.scope: Consumed 1min 44.061s CPU time.
Feb 16 12:59:33 compute-1 systemd-logind[821]: Removed session 12.
Feb 16 12:59:41 compute-1 sshd-session[52358]: Accepted publickey for zuul from 192.168.122.30 port 33228 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 12:59:41 compute-1 systemd-logind[821]: New session 13 of user zuul.
Feb 16 12:59:41 compute-1 systemd[1]: Started Session 13 of User zuul.
Feb 16 12:59:41 compute-1 sshd-session[52358]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 12:59:42 compute-1 python3.9[52511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 12:59:44 compute-1 sudo[52665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpmdaahskvpktsrdvtlrpxutbwsgulsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246783.6241434-55-155211838526421/AnsiballZ_getent.py'
Feb 16 12:59:44 compute-1 sudo[52665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:44 compute-1 python3.9[52667]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 16 12:59:44 compute-1 sudo[52665]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:44 compute-1 sudo[52818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mecenbqiffagtdbmotvvhdmnckmarxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246784.504341-71-99423549104004/AnsiballZ_group.py'
Feb 16 12:59:44 compute-1 sudo[52818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:45 compute-1 python3.9[52820]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 12:59:45 compute-1 groupadd[52821]: group added to /etc/group: name=openvswitch, GID=42476
Feb 16 12:59:45 compute-1 groupadd[52821]: group added to /etc/gshadow: name=openvswitch
Feb 16 12:59:45 compute-1 groupadd[52821]: new group: name=openvswitch, GID=42476
Feb 16 12:59:45 compute-1 sudo[52818]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:45 compute-1 sudo[52976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gniwpryqmfrgihsammdqpyhpqyetrjqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246785.4662197-87-257971790629650/AnsiballZ_user.py'
Feb 16 12:59:45 compute-1 sudo[52976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:46 compute-1 python3.9[52978]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 12:59:46 compute-1 useradd[52980]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 12:59:46 compute-1 useradd[52980]: add 'openvswitch' to group 'hugetlbfs'
Feb 16 12:59:46 compute-1 useradd[52980]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 16 12:59:46 compute-1 sudo[52976]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:47 compute-1 sudo[53136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llucuulwxcmbvzekvwmjquszmuyaaxrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246786.9102027-107-259408300831031/AnsiballZ_setup.py'
Feb 16 12:59:47 compute-1 sudo[53136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:47 compute-1 python3.9[53138]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 12:59:47 compute-1 sudo[53136]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:48 compute-1 sudo[53222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frtgnulpnkmqnwrdypwxzmkrgpqplllv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246786.9102027-107-259408300831031/AnsiballZ_dnf.py'
Feb 16 12:59:48 compute-1 sudo[53222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:48 compute-1 python3.9[53224]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 12:59:48 compute-1 sshd-session[53170]: Connection closed by authenticating user root 142.93.238.36 port 53996 [preauth]
Feb 16 12:59:49 compute-1 sudo[53222]: pam_unix(sudo:session): session closed for user root
Feb 16 12:59:50 compute-1 sshd-session[53259]: Connection closed by 146.190.226.24 port 36746
Feb 16 12:59:51 compute-1 sudo[53385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kterfzrqebvsslloztcwiuajfttzdbsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246791.3872955-135-53140588574730/AnsiballZ_dnf.py'
Feb 16 12:59:51 compute-1 sudo[53385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 12:59:51 compute-1 python3.9[53387]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:04 compute-1 kernel: SELinux:  Converting 2740 SID table entries...
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:00:04 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:00:04 compute-1 groupadd[53410]: group added to /etc/group: name=unbound, GID=994
Feb 16 13:00:04 compute-1 groupadd[53410]: group added to /etc/gshadow: name=unbound
Feb 16 13:00:04 compute-1 groupadd[53410]: new group: name=unbound, GID=994
Feb 16 13:00:04 compute-1 useradd[53417]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 16 13:00:04 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 16 13:00:04 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 16 13:00:05 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:05 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:05 compute-1 systemd[1]: Reloading.
Feb 16 13:00:05 compute-1 systemd-rc-local-generator[53907]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:05 compute-1 systemd-sysv-generator[53917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:05 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:06 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:06 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:06 compute-1 systemd[1]: run-r95e2502eb45e484f9cda591cd03994a9.service: Deactivated successfully.
Feb 16 13:00:06 compute-1 sudo[53385]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:07 compute-1 sudo[54494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hopjuoerjjdhysjhvcrquhffpwystzcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246806.6658876-151-82998274240537/AnsiballZ_systemd.py'
Feb 16 13:00:07 compute-1 sudo[54494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:07 compute-1 python3.9[54496]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:00:07 compute-1 systemd[1]: Reloading.
Feb 16 13:00:07 compute-1 systemd-rc-local-generator[54525]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:07 compute-1 systemd-sysv-generator[54528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:07 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Feb 16 13:00:07 compute-1 chown[54545]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 16 13:00:07 compute-1 ovs-ctl[54550]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 16 13:00:07 compute-1 ovs-ctl[54550]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 16 13:00:08 compute-1 ovs-ctl[54550]: Starting ovsdb-server [  OK  ]
Feb 16 13:00:08 compute-1 ovs-vsctl[54599]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 16 13:00:08 compute-1 ovs-vsctl[54619]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"54c1a259-778a-4222-b2c6-8422ea19a065\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 16 13:00:08 compute-1 ovs-ctl[54550]: Configuring Open vSwitch system IDs [  OK  ]
Feb 16 13:00:08 compute-1 ovs-ctl[54550]: Enabling remote OVSDB managers [  OK  ]
Feb 16 13:00:08 compute-1 ovs-vsctl[54625]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb 16 13:00:08 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Feb 16 13:00:08 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 16 13:00:08 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 16 13:00:08 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 16 13:00:08 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Feb 16 13:00:08 compute-1 ovs-ctl[54669]: Inserting openvswitch module [  OK  ]
Feb 16 13:00:08 compute-1 ovs-ctl[54638]: Starting ovs-vswitchd [  OK  ]
Feb 16 13:00:08 compute-1 ovs-ctl[54638]: Enabling remote OVSDB managers [  OK  ]
Feb 16 13:00:08 compute-1 ovs-vsctl[54687]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Feb 16 13:00:08 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 16 13:00:08 compute-1 systemd[1]: Starting Open vSwitch...
Feb 16 13:00:08 compute-1 systemd[1]: Finished Open vSwitch.
Feb 16 13:00:08 compute-1 sudo[54494]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:09 compute-1 python3.9[54838]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:00:09 compute-1 sudo[54988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbatqozfncgnttmumjtwzrashdwtaqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246809.4897783-189-220243566802136/AnsiballZ_sefcontext.py'
Feb 16 13:00:09 compute-1 sudo[54988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:10 compute-1 python3.9[54990]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 16 13:00:11 compute-1 kernel: SELinux:  Converting 2754 SID table entries...
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:00:11 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:00:11 compute-1 sudo[54988]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:12 compute-1 python3.9[55145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:00:12 compute-1 sudo[55301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rprdydjzyeqgynhgfaypqvzvjnrezufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246812.5454836-225-177449148198546/AnsiballZ_dnf.py'
Feb 16 13:00:12 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 16 13:00:12 compute-1 sudo[55301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:13 compute-1 python3.9[55303]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:14 compute-1 sudo[55301]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:15 compute-1 sudo[55454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neqpsilcxstigxobmpcxhbeyhrqxuuxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246814.8626173-241-213190435225823/AnsiballZ_command.py'
Feb 16 13:00:15 compute-1 sudo[55454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:15 compute-1 python3.9[55456]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:00:16 compute-1 sudo[55454]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:16 compute-1 sudo[55741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giteyepjftbfvfamcmttibhazzxbyehn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246816.3761806-257-206560062355688/AnsiballZ_file.py'
Feb 16 13:00:16 compute-1 sudo[55741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:17 compute-1 python3.9[55743]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 16 13:00:17 compute-1 sudo[55741]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:17 compute-1 python3.9[55893]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:00:18 compute-1 sudo[56045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdotmyjmftfxrfmuwbjeflauwynmklqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246818.1333172-289-77776640884284/AnsiballZ_dnf.py'
Feb 16 13:00:18 compute-1 sudo[56045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:18 compute-1 python3.9[56047]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:20 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:20 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:20 compute-1 systemd[1]: Reloading.
Feb 16 13:00:20 compute-1 systemd-rc-local-generator[56087]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:20 compute-1 systemd-sysv-generator[56092]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:20 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:21 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:21 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:21 compute-1 systemd[1]: run-rdad02feaf6984db38a7f23cadec55830.service: Deactivated successfully.
Feb 16 13:00:21 compute-1 sudo[56045]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:21 compute-1 sudo[56370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yistyicpyogugtjhqtwggpwagruxtrie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246821.4691322-305-279205542499368/AnsiballZ_systemd.py'
Feb 16 13:00:21 compute-1 sudo[56370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:21 compute-1 python3.9[56372]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:00:22 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 13:00:22 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Feb 16 13:00:22 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Feb 16 13:00:22 compute-1 systemd[1]: Stopping Network Manager...
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0164] caught SIGTERM, shutting down normally.
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0177] dhcp4 (eth0): canceled DHCP transaction
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0177] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0177] dhcp4 (eth0): state changed no lease
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0179] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 13:00:22 compute-1 NetworkManager[7706]: <info>  [1771246822.0227] exiting (success)
Feb 16 13:00:22 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 13:00:22 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 13:00:22 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 13:00:22 compute-1 systemd[1]: Stopped Network Manager.
Feb 16 13:00:22 compute-1 systemd[1]: NetworkManager.service: Consumed 15.989s CPU time, 4.1M memory peak, read 0B from disk, written 18.0K to disk.
Feb 16 13:00:22 compute-1 systemd[1]: Starting Network Manager...
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.0772] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:cd836bab-140a-4a06-bcbf-b453ec38ea52)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.0775] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.0815] manager[0x55b89a5ee000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 13:00:22 compute-1 systemd[1]: Starting Hostname Service...
Feb 16 13:00:22 compute-1 systemd[1]: Started Hostname Service.
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1300] hostname: hostname: using hostnamed
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1300] hostname: static hostname changed from (none) to "compute-1"
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1305] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1308] manager[0x55b89a5ee000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1308] manager[0x55b89a5ee000]: rfkill: WWAN hardware radio set enabled
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1327] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1336] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1337] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1337] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1338] manager: Networking is enabled by state file
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1340] settings: Loaded settings plugin: keyfile (internal)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1343] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1371] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1382] dhcp: init: Using DHCP client 'internal'
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1385] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1390] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1394] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1401] device (lo): Activation: starting connection 'lo' (1e10248c-d525-48d3-b66b-d34bc8862c9f)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1407] device (eth0): carrier: link connected
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1409] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1414] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1414] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1420] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1427] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1431] device (eth1): carrier: link connected
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1434] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1439] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b) (indicated)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1439] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1444] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1453] device (eth1): Activation: starting connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b)
Feb 16 13:00:22 compute-1 systemd[1]: Started Network Manager.
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1459] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1472] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1474] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1476] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1477] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1479] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1481] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1482] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1484] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1488] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1490] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1502] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1512] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1522] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1524] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1527] device (lo): Activation: successful, device activated.
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1535] dhcp4 (eth0): state changed new lease, address=38.102.83.251
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1542] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1853] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1859] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1866] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1869] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1873] device (eth1): Activation: successful, device activated.
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1891] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1893] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1895] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1897] device (eth0): Activation: successful, device activated.
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1903] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 13:00:22 compute-1 NetworkManager[56388]: <info>  [1771246822.1905] manager: startup complete
Feb 16 13:00:22 compute-1 sudo[56370]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:22 compute-1 systemd[1]: Starting Network Manager Wait Online...
Feb 16 13:00:22 compute-1 systemd[1]: Finished Network Manager Wait Online.
Feb 16 13:00:22 compute-1 sudo[56596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgdmfqmulezqklslfmocuhobyxgzgycz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246822.416503-321-127416066514421/AnsiballZ_dnf.py'
Feb 16 13:00:22 compute-1 sudo[56596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:22 compute-1 python3.9[56598]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:00:32 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:00:32 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:00:32 compute-1 systemd[1]: Reloading.
Feb 16 13:00:32 compute-1 systemd-rc-local-generator[56650]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:00:32 compute-1 systemd-sysv-generator[56653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:00:32 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 13:00:32 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:00:33 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:00:33 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:00:33 compute-1 systemd[1]: run-r53c1714be43b4728a155e7b3a094fd5e.service: Deactivated successfully.
Feb 16 13:00:33 compute-1 sudo[56596]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:33 compute-1 sudo[57066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swmtncrgobrjkzzjvyuhzerybvuczshm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246833.5844598-345-37651870701404/AnsiballZ_stat.py'
Feb 16 13:00:33 compute-1 sudo[57066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:34 compute-1 python3.9[57068]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:00:34 compute-1 sudo[57066]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:34 compute-1 sudo[57218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sofpxdudbikqdqpuuawalkidegxgwpgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246834.285546-363-57844051173136/AnsiballZ_ini_file.py'
Feb 16 13:00:34 compute-1 sudo[57218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:34 compute-1 python3.9[57220]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:34 compute-1 sudo[57218]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:35 compute-1 sudo[57372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujzffinidqailbetavzymuvqdukxddxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246835.1383286-383-173546096699240/AnsiballZ_ini_file.py'
Feb 16 13:00:35 compute-1 sudo[57372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:35 compute-1 python3.9[57374]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:35 compute-1 sudo[57372]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:36 compute-1 sudo[57524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhoyfmhkthinsusxymfbbueprieppviv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246835.7747188-383-269580528787431/AnsiballZ_ini_file.py'
Feb 16 13:00:36 compute-1 sudo[57524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:36 compute-1 python3.9[57526]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:36 compute-1 sudo[57524]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:36 compute-1 sudo[57676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdrjvzvkbocczhhszxshzvemmqaywcfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246836.5976782-413-255881608174978/AnsiballZ_ini_file.py'
Feb 16 13:00:36 compute-1 sudo[57676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:36 compute-1 python3.9[57678]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:36 compute-1 sudo[57676]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:37 compute-1 sudo[57828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfqzczjqrjfyaoheoeyhdxkkrabgbrcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.1130347-413-183151503376386/AnsiballZ_ini_file.py'
Feb 16 13:00:37 compute-1 sudo[57828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:37 compute-1 python3.9[57830]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:37 compute-1 sudo[57828]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:37 compute-1 sudo[57980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftwruwsmblbafkaysjxxbwymxyuuvfss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.770394-443-240909399953953/AnsiballZ_stat.py'
Feb 16 13:00:37 compute-1 sudo[57980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:38 compute-1 python3.9[57982]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:38 compute-1 sudo[57980]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:38 compute-1 sudo[58103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwujtyncyxulksipkvcgkhmybahuukef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246837.770394-443-240909399953953/AnsiballZ_copy.py'
Feb 16 13:00:38 compute-1 sudo[58103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:38 compute-1 python3.9[58105]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246837.770394-443-240909399953953/.source _original_basename=.6flhmsym follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:38 compute-1 sudo[58103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:39 compute-1 sudo[58255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsjtcfmkycoorlsigthodnzeodcjnlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246838.9889011-473-5887026869733/AnsiballZ_file.py'
Feb 16 13:00:39 compute-1 sudo[58255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:39 compute-1 python3.9[58257]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:39 compute-1 sudo[58255]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:40 compute-1 sudo[58407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmjpnllfymzydzotyqnuvceqvzgaobfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246839.685017-489-52177271131927/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 16 13:00:40 compute-1 sudo[58407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:40 compute-1 python3.9[58409]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 16 13:00:40 compute-1 sudo[58407]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:40 compute-1 sudo[58559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvjxlqirpfnlxyljiapwbprfrdobpjja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246840.49906-507-58207908642050/AnsiballZ_file.py'
Feb 16 13:00:40 compute-1 sudo[58559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:40 compute-1 python3.9[58561]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:40 compute-1 sudo[58559]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:41 compute-1 sudo[58711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjwbbksndmhdwiqqfmgwpptvvwkksesw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246841.2385297-527-32285141665851/AnsiballZ_stat.py'
Feb 16 13:00:41 compute-1 sudo[58711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:41 compute-1 sudo[58711]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:41 compute-1 sudo[58834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbkmqjjzgwycayjqqlesgvrhwytuayjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246841.2385297-527-32285141665851/AnsiballZ_copy.py'
Feb 16 13:00:41 compute-1 sudo[58834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:42 compute-1 sudo[58834]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:42 compute-1 sudo[58986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abpdpzqsohqfpdczbhnlojikstvxjwfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246842.425655-557-177392914609950/AnsiballZ_slurp.py'
Feb 16 13:00:42 compute-1 sudo[58986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:42 compute-1 python3.9[58988]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 16 13:00:42 compute-1 sudo[58986]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:43 compute-1 sudo[59161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfhdnaqiscuoolizzwgiwujkzduihdyl ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.3249748-575-15578743521420/async_wrapper.py j539062631809 300 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.3249748-575-15578743521420/AnsiballZ_edpm_os_net_config.py _'
Feb 16 13:00:43 compute-1 sudo[59161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:44 compute-1 ansible-async_wrapper.py[59163]: Invoked with j539062631809 300 /home/zuul/.ansible/tmp/ansible-tmp-1771246843.3249748-575-15578743521420/AnsiballZ_edpm_os_net_config.py _
Feb 16 13:00:44 compute-1 ansible-async_wrapper.py[59166]: Starting module and watcher
Feb 16 13:00:44 compute-1 ansible-async_wrapper.py[59166]: Start watching 59167 (300)
Feb 16 13:00:44 compute-1 ansible-async_wrapper.py[59167]: Start module (59167)
Feb 16 13:00:44 compute-1 ansible-async_wrapper.py[59163]: Return async_wrapper task started.
Feb 16 13:00:44 compute-1 sudo[59161]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:44 compute-1 python3.9[59168]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 16 13:00:44 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 16 13:00:44 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 16 13:00:44 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 16 13:00:44 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 16 13:00:44 compute-1 kernel: cfg80211: failed to load regulatory.db
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.8873] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.8889] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9359] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9361] audit: op="connection-add" uuid="08e1c53a-b78e-45ee-806c-db7b727f66c8" name="br-ex-br" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9375] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9376] audit: op="connection-add" uuid="eeb5dd89-eb60-40d0-9533-e284810b7aee" name="br-ex-port" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9391] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9392] audit: op="connection-add" uuid="af38d693-d2d7-49da-8a16-1e6cd8e728d6" name="eth1-port" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9405] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9406] audit: op="connection-add" uuid="7354697d-d41e-4761-85cc-f839bcc97da5" name="vlan20-port" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9418] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9419] audit: op="connection-add" uuid="54f63c3a-2c74-40f2-836c-327361bc3a9a" name="vlan21-port" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9430] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9432] audit: op="connection-add" uuid="c495b78d-95d4-4739-b607-03f9064cf75f" name="vlan22-port" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9450] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9466] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9470] audit: op="connection-add" uuid="381ade61-0b82-4f1a-a193-f1a5c38c2457" name="br-ex-if" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9531] audit: op="connection-update" uuid="648a3a79-0232-5a2f-bab7-8580a0ffce3b" name="ci-private-network" args="ovs-external-ids.data,connection.master,connection.port-type,connection.slave-type,connection.controller,connection.timestamp,ipv6.dns,ipv6.method,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv6.addresses,ovs-interface.type,ipv4.dns,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv4.addresses" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9548] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9550] audit: op="connection-add" uuid="5f974cbc-8abc-4069-b72d-b7bdaa8a02cb" name="vlan20-if" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9563] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9565] audit: op="connection-add" uuid="46084d4e-8a92-47cf-88eb-c3835c0d8ec1" name="vlan21-if" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9579] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9581] audit: op="connection-add" uuid="8161444f-281f-4b96-a332-ecd4eb3bc9cb" name="vlan22-if" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9592] audit: op="connection-delete" uuid="044b5d65-36c8-3a82-aeee-bbb48ac5f904" name="Wired connection 1" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9603] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9605] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9611] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9615] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (08e1c53a-b78e-45ee-806c-db7b727f66c8)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9616] audit: op="connection-activate" uuid="08e1c53a-b78e-45ee-806c-db7b727f66c8" name="br-ex-br" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9618] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9619] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9623] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9626] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (eeb5dd89-eb60-40d0-9533-e284810b7aee)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9628] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9629] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9633] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9636] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (af38d693-d2d7-49da-8a16-1e6cd8e728d6)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9638] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9639] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9644] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9648] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (7354697d-d41e-4761-85cc-f839bcc97da5)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9650] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9651] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9656] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9661] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (54f63c3a-2c74-40f2-836c-327361bc3a9a)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9663] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9664] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9670] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9673] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (c495b78d-95d4-4739-b607-03f9064cf75f)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9674] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9676] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9678] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9684] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9685] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9687] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9692] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (381ade61-0b82-4f1a-a193-f1a5c38c2457)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9692] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9695] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9696] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9697] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9699] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9709] device (eth1): disconnecting for new activation request.
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9710] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9714] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9716] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9717] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9721] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9722] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9725] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9730] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (5f974cbc-8abc-4069-b72d-b7bdaa8a02cb)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9731] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9734] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9737] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9738] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9741] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9742] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9746] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9751] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (46084d4e-8a92-47cf-88eb-c3835c0d8ec1)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9752] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9755] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9757] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9759] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9762] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <warn>  [1771246845.9763] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9767] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9772] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (8161444f-281f-4b96-a332-ecd4eb3bc9cb)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9774] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9777] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9780] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9781] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9783] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9796] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=59169 uid=0 result="success"
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9798] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9802] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9804] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9811] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9815] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9819] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9822] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9823] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 kernel: ovs-system: entered promiscuous mode
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9839] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9844] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 kernel: Timeout policy base is empty
Feb 16 13:00:45 compute-1 systemd-udevd[59174]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9856] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9858] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9863] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9869] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9873] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9876] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9882] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9887] dhcp4 (eth0): canceled DHCP transaction
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9887] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9888] dhcp4 (eth0): state changed no lease
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9889] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9898] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 16 13:00:45 compute-1 NetworkManager[56388]: <info>  [1771246845.9902] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59169 uid=0 result="fail" reason="Device is not activated"
Feb 16 13:00:45 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 13:00:46 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0015] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0019] dhcp4 (eth0): state changed new lease, address=38.102.83.251
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0085] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:46 compute-1 kernel: br-ex: entered promiscuous mode
Feb 16 13:00:46 compute-1 kernel: vlan21: entered promiscuous mode
Feb 16 13:00:46 compute-1 systemd-udevd[59173]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:46 compute-1 kernel: vlan20: entered promiscuous mode
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0228] device (eth1): Activation: starting connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b)
Feb 16 13:00:46 compute-1 systemd-udevd[59262]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0236] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0250] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0254] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0267] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0278] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0281] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0283] device (eth1): released from controller device eth1
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0288] device (eth1): disconnecting for new activation request.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0289] audit: op="connection-activate" uuid="648a3a79-0232-5a2f-bab7-8580a0ffce3b" name="ci-private-network" pid=59169 uid=0 result="success"
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0289] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0290] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0292] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0293] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0293] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0300] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0306] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0308] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0311] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0315] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 kernel: vlan22: entered promiscuous mode
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0328] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0331] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0334] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0337] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0341] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0344] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0349] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0352] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0356] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0361] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0367] device (eth1): Activation: starting connection 'ci-private-network' (648a3a79-0232-5a2f-bab7-8580a0ffce3b)
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0386] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0389] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0394] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0394] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59169 uid=0 result="success"
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0397] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0408] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0415] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0424] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0441] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0445] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0449] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0456] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0457] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0460] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0469] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0474] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0478] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0489] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0490] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0494] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0498] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0503] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0508] device (eth1): Activation: successful, device activated.
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0517] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0573] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0574] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 13:00:46 compute-1 NetworkManager[56388]: <info>  [1771246846.0578] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.1877] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.3364] checkpoint[0x55b89a5c3950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.3368] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.5727] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.5738] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.7407] audit: op="networking-control" arg="global-dns-configuration" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.7437] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.7477] audit: op="networking-control" arg="global-dns-configuration" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.7512] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 sudo[59511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaudvufeayjdypkxwxdwdbatyzxhjlbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246847.3094776-575-163838554033312/AnsiballZ_async_status.py'
Feb 16 13:00:47 compute-1 sudo[59511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.8582] checkpoint[0x55b89a5c3a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 16 13:00:47 compute-1 NetworkManager[56388]: <info>  [1771246847.8587] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59169 uid=0 result="success"
Feb 16 13:00:47 compute-1 ansible-async_wrapper.py[59167]: Module complete (59167)
Feb 16 13:00:48 compute-1 python3.9[59513]: ansible-ansible.legacy.async_status Invoked with jid=j539062631809.59163 mode=status _async_dir=/root/.ansible_async
Feb 16 13:00:48 compute-1 sudo[59511]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:48 compute-1 sudo[59611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqwsmyxmncnpcgrnfedhrhxgvslsexir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246847.3094776-575-163838554033312/AnsiballZ_async_status.py'
Feb 16 13:00:48 compute-1 sudo[59611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:48 compute-1 python3.9[59613]: ansible-ansible.legacy.async_status Invoked with jid=j539062631809.59163 mode=cleanup _async_dir=/root/.ansible_async
Feb 16 13:00:48 compute-1 sudo[59611]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:49 compute-1 ansible-async_wrapper.py[59166]: Done in kid B.
Feb 16 13:00:49 compute-1 sudo[59763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuzkdbrvouzwyinxhmwikdgqgxxnfgdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246848.88832-619-138758286743745/AnsiballZ_stat.py'
Feb 16 13:00:49 compute-1 sudo[59763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:49 compute-1 python3.9[59765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:49 compute-1 sudo[59763]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:49 compute-1 sudo[59886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbadrpsdhmknpsvmuktooaeaqoswhnva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246848.88832-619-138758286743745/AnsiballZ_copy.py'
Feb 16 13:00:49 compute-1 sudo[59886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:49 compute-1 python3.9[59888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246848.88832-619-138758286743745/.source.returncode _original_basename=.tb9_7fjf follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:49 compute-1 sudo[59886]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:50 compute-1 sudo[60038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxgpezhhdgxpoehtbvbydljyrvetqmbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246850.136598-651-259597532479184/AnsiballZ_stat.py'
Feb 16 13:00:50 compute-1 sudo[60038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:50 compute-1 python3.9[60040]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:00:50 compute-1 sudo[60038]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:50 compute-1 sudo[60163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysgwlhtsvwnfwrtmelwszkposnnozlky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246850.136598-651-259597532479184/AnsiballZ_copy.py'
Feb 16 13:00:50 compute-1 sudo[60163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:50 compute-1 sshd-session[60041]: Connection closed by authenticating user root 142.93.238.36 port 59932 [preauth]
Feb 16 13:00:51 compute-1 python3.9[60165]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246850.136598-651-259597532479184/.source.cfg _original_basename=.mx42p4g5 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:00:51 compute-1 sudo[60163]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:51 compute-1 sudo[60316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-supohhgircfbgvbscnrhdrlzqzystgdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246851.2918057-681-255020335467895/AnsiballZ_systemd.py'
Feb 16 13:00:51 compute-1 sudo[60316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:00:51 compute-1 python3.9[60318]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:00:51 compute-1 systemd[1]: Reloading Network Manager...
Feb 16 13:00:51 compute-1 NetworkManager[56388]: <info>  [1771246851.9174] audit: op="reload" arg="0" pid=60322 uid=0 result="success"
Feb 16 13:00:51 compute-1 NetworkManager[56388]: <info>  [1771246851.9185] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 16 13:00:51 compute-1 systemd[1]: Reloaded Network Manager.
Feb 16 13:00:51 compute-1 sudo[60316]: pam_unix(sudo:session): session closed for user root
Feb 16 13:00:52 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 13:00:52 compute-1 sshd-session[52361]: Connection closed by 192.168.122.30 port 33228
Feb 16 13:00:52 compute-1 sshd-session[52358]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:00:52 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Feb 16 13:00:52 compute-1 systemd[1]: session-13.scope: Consumed 44.731s CPU time.
Feb 16 13:00:52 compute-1 systemd-logind[821]: Session 13 logged out. Waiting for processes to exit.
Feb 16 13:00:52 compute-1 systemd-logind[821]: Removed session 13.
Feb 16 13:00:58 compute-1 sshd-session[60355]: Accepted publickey for zuul from 192.168.122.30 port 45652 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:00:58 compute-1 systemd-logind[821]: New session 14 of user zuul.
Feb 16 13:00:58 compute-1 systemd[1]: Started Session 14 of User zuul.
Feb 16 13:00:58 compute-1 sshd-session[60355]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:00:59 compute-1 python3.9[60508]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:01 compute-1 CROND[60665]: (root) CMD (run-parts /etc/cron.hourly)
Feb 16 13:01:01 compute-1 run-parts[60668]: (/etc/cron.hourly) starting 0anacron
Feb 16 13:01:01 compute-1 anacron[60676]: Anacron started on 2026-02-16
Feb 16 13:01:01 compute-1 anacron[60676]: Will run job `cron.daily' in 31 min.
Feb 16 13:01:01 compute-1 anacron[60676]: Will run job `cron.weekly' in 51 min.
Feb 16 13:01:01 compute-1 anacron[60676]: Will run job `cron.monthly' in 71 min.
Feb 16 13:01:01 compute-1 anacron[60676]: Jobs will be executed sequentially
Feb 16 13:01:01 compute-1 run-parts[60678]: (/etc/cron.hourly) finished 0anacron
Feb 16 13:01:01 compute-1 CROND[60664]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 16 13:01:01 compute-1 python3.9[60663]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:01 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 13:01:02 compute-1 python3.9[60868]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:02 compute-1 sshd-session[60358]: Connection closed by 192.168.122.30 port 45652
Feb 16 13:01:02 compute-1 sshd-session[60355]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:01:02 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Feb 16 13:01:02 compute-1 systemd[1]: session-14.scope: Consumed 1.852s CPU time.
Feb 16 13:01:02 compute-1 systemd-logind[821]: Session 14 logged out. Waiting for processes to exit.
Feb 16 13:01:02 compute-1 systemd-logind[821]: Removed session 14.
Feb 16 13:01:08 compute-1 sshd-session[60896]: Accepted publickey for zuul from 192.168.122.30 port 54834 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:01:08 compute-1 systemd-logind[821]: New session 15 of user zuul.
Feb 16 13:01:08 compute-1 systemd[1]: Started Session 15 of User zuul.
Feb 16 13:01:08 compute-1 sshd-session[60896]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:01:09 compute-1 python3.9[61049]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:10 compute-1 python3.9[61203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:10 compute-1 sudo[61358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmgqjxvpzzvprsidsnlqbrkhdmnlglzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246870.5891507-61-44338004890036/AnsiballZ_setup.py'
Feb 16 13:01:10 compute-1 sudo[61358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:11 compute-1 python3.9[61360]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:11 compute-1 sudo[61358]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:11 compute-1 sudo[61442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywumgdgdablpzplkrhtredavbnxmcrhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246870.5891507-61-44338004890036/AnsiballZ_dnf.py'
Feb 16 13:01:11 compute-1 sudo[61442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:11 compute-1 python3.9[61444]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:13 compute-1 sudo[61442]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:13 compute-1 sudo[61595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbdozmqizmdvgxqnpyyizrthwxwjfmyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246873.4250102-85-226978884340092/AnsiballZ_setup.py'
Feb 16 13:01:13 compute-1 sudo[61595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:13 compute-1 python3.9[61597]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:14 compute-1 sudo[61595]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:14 compute-1 sudo[61787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crhnxisqjsjxlzhhrqykafbfzwtkkygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246874.4634433-107-26054617771621/AnsiballZ_file.py'
Feb 16 13:01:14 compute-1 sudo[61787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:15 compute-1 python3.9[61789]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:15 compute-1 sudo[61787]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:16 compute-1 sudo[61939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kacpmpsnvtkbgionzmvbrsrrvrrszwnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246875.9919765-123-86564774536018/AnsiballZ_command.py'
Feb 16 13:01:16 compute-1 sudo[61939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:16 compute-1 python3.9[61941]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:16 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:01:16 compute-1 sudo[61939]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:17 compute-1 sudo[62103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymjrailavcbgowgmjaglcmnrfurgxuth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246876.8769853-139-142338265263520/AnsiballZ_stat.py'
Feb 16 13:01:17 compute-1 sudo[62103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:17 compute-1 python3.9[62105]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:17 compute-1 sudo[62103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:17 compute-1 sudo[62181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftptnhvuksddvcnmruvkpyuqtleelkes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246876.8769853-139-142338265263520/AnsiballZ_file.py'
Feb 16 13:01:17 compute-1 sudo[62181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:17 compute-1 python3.9[62183]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:17 compute-1 sudo[62181]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:18 compute-1 sudo[62333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egbvfjnwvfogpiybnkgyjbqyvvjrbbba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246878.1206558-163-32511303767653/AnsiballZ_stat.py'
Feb 16 13:01:18 compute-1 sudo[62333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:18 compute-1 python3.9[62335]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:18 compute-1 sudo[62333]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:18 compute-1 sudo[62411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-capsvxnrbjhfmpualzaxggcpniyqeogn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246878.1206558-163-32511303767653/AnsiballZ_file.py'
Feb 16 13:01:19 compute-1 sudo[62411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:19 compute-1 python3.9[62413]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:19 compute-1 sudo[62411]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:20 compute-1 sudo[62563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxgjkbshvqvlbywomkphjksusmshokgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246879.7710931-189-216982209180208/AnsiballZ_ini_file.py'
Feb 16 13:01:20 compute-1 sudo[62563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:20 compute-1 python3.9[62565]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:20 compute-1 sudo[62563]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:20 compute-1 sudo[62715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbprfmduzxcvflotwlahcqktfmfitsmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246880.5007496-189-117974259119928/AnsiballZ_ini_file.py'
Feb 16 13:01:20 compute-1 sudo[62715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:20 compute-1 python3.9[62717]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:20 compute-1 sudo[62715]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:21 compute-1 sudo[62867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sukdnajclvnikswgwnqjonlxvrgjvria ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246881.0580788-189-65030368415168/AnsiballZ_ini_file.py'
Feb 16 13:01:21 compute-1 sudo[62867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:21 compute-1 python3.9[62869]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:21 compute-1 sudo[62867]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:21 compute-1 sudo[63019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmsacuzauqrwhwtmwghcagllqmvuxfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246881.6098409-189-220845255053040/AnsiballZ_ini_file.py'
Feb 16 13:01:21 compute-1 sudo[63019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:22 compute-1 python3.9[63021]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:22 compute-1 sudo[63019]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:22 compute-1 sudo[63171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcsrofousburenwazxmrxjymgxqxajuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246882.3750763-251-23495568881409/AnsiballZ_dnf.py'
Feb 16 13:01:22 compute-1 sudo[63171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:22 compute-1 python3.9[63173]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:22 compute-1 sshd-session[63174]: Connection closed by 2.57.122.210 port 47342
Feb 16 13:01:24 compute-1 sudo[63171]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:24 compute-1 sudo[63325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxpcmqorqqdshrenalsureyaifolktn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246884.6134133-273-6912314131785/AnsiballZ_setup.py'
Feb 16 13:01:24 compute-1 sudo[63325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:25 compute-1 python3.9[63327]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:25 compute-1 sudo[63325]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:25 compute-1 sudo[63479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpctaobyyzovocsdxuljardepmndelwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246885.3212879-289-118758290180323/AnsiballZ_stat.py'
Feb 16 13:01:25 compute-1 sudo[63479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:25 compute-1 python3.9[63481]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:01:25 compute-1 sudo[63479]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:26 compute-1 sudo[63631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwichhkcvhyzixgojakeqioegamtoorm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246886.0210426-307-167879299780088/AnsiballZ_stat.py'
Feb 16 13:01:26 compute-1 sudo[63631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:26 compute-1 python3.9[63633]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:01:26 compute-1 sudo[63631]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:27 compute-1 sudo[63783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbbaulimrcnrblxwsshkkebohvbmlcya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246886.7640367-327-273036069160219/AnsiballZ_command.py'
Feb 16 13:01:27 compute-1 sudo[63783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:27 compute-1 python3.9[63785]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:01:27 compute-1 sudo[63783]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:28 compute-1 sudo[63936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzvntmbnpdbaormizuosiqfdudkfjsgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246887.4876077-347-197488216321634/AnsiballZ_service_facts.py'
Feb 16 13:01:28 compute-1 sudo[63936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:28 compute-1 python3.9[63938]: ansible-service_facts Invoked
Feb 16 13:01:28 compute-1 network[63955]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:01:28 compute-1 network[63956]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:01:28 compute-1 network[63957]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:01:31 compute-1 sudo[63936]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:32 compute-1 sudo[64241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yotscjrmlvgwxjkejmkotkujydfngver ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771246892.2639272-377-176221155807683/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771246892.2639272-377-176221155807683/args'
Feb 16 13:01:32 compute-1 sudo[64241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:32 compute-1 sudo[64241]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:33 compute-1 sudo[64408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auapcyenxofrkcnbizudztlaipkzsceq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246892.9884489-399-241420796366673/AnsiballZ_dnf.py'
Feb 16 13:01:33 compute-1 sudo[64408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:33 compute-1 python3.9[64410]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:01:35 compute-1 sudo[64408]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:36 compute-1 sudo[64561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdrwjqtrrkzwtnrggalkkdjdglvhawdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246895.562665-426-85635440923701/AnsiballZ_package_facts.py'
Feb 16 13:01:36 compute-1 sudo[64561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:36 compute-1 python3.9[64563]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 16 13:01:36 compute-1 sudo[64561]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:37 compute-1 sudo[64713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efurldkwflrrbdyrdlxllemyatclrwwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246897.3742175-445-143361452078568/AnsiballZ_stat.py'
Feb 16 13:01:37 compute-1 sudo[64713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:37 compute-1 python3.9[64715]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:37 compute-1 sudo[64713]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:38 compute-1 sudo[64838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxaikmmdazlhqhmwoattfufoobskouao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246897.3742175-445-143361452078568/AnsiballZ_copy.py'
Feb 16 13:01:38 compute-1 sudo[64838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:38 compute-1 python3.9[64840]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246897.3742175-445-143361452078568/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:38 compute-1 sudo[64838]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:39 compute-1 sudo[64992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvroaelejanyacnpaqpezzrztgjzisr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246898.8474994-475-162891940016802/AnsiballZ_stat.py'
Feb 16 13:01:39 compute-1 sudo[64992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:39 compute-1 python3.9[64994]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:39 compute-1 sudo[64992]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:39 compute-1 sudo[65117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuopwwvgzykxenvwnbvrxkysrnmnebkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246898.8474994-475-162891940016802/AnsiballZ_copy.py'
Feb 16 13:01:39 compute-1 sudo[65117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:39 compute-1 python3.9[65119]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246898.8474994-475-162891940016802/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:39 compute-1 sudo[65117]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:40 compute-1 sudo[65271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgkuglmpmsebzwqumztvcvdtfloujibc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246900.5976238-518-240815317571574/AnsiballZ_lineinfile.py'
Feb 16 13:01:40 compute-1 sudo[65271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:41 compute-1 python3.9[65273]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:41 compute-1 sudo[65271]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:42 compute-1 sudo[65425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwkbdnoriuqdemxwbgzcicmfpbxbpfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246901.9728858-548-29323251315044/AnsiballZ_setup.py'
Feb 16 13:01:42 compute-1 sudo[65425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:42 compute-1 python3.9[65427]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:42 compute-1 sudo[65425]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:43 compute-1 sudo[65510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxjfdvvslpswbcnwjpadfagnmacfaqgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246901.9728858-548-29323251315044/AnsiballZ_systemd.py'
Feb 16 13:01:43 compute-1 sudo[65510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:43 compute-1 python3.9[65512]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:01:43 compute-1 sudo[65510]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:44 compute-1 sudo[65664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpprafmmpctmtgtdjvsrbfagylbtotrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246904.315504-580-170939386557434/AnsiballZ_setup.py'
Feb 16 13:01:44 compute-1 sudo[65664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:44 compute-1 python3.9[65666]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:01:45 compute-1 sudo[65664]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:45 compute-1 sudo[65748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvqrcyqywaazwfzqljmqnyaicxhlbnkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246904.315504-580-170939386557434/AnsiballZ_systemd.py'
Feb 16 13:01:45 compute-1 sudo[65748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:45 compute-1 python3.9[65750]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:01:45 compute-1 chronyd[801]: chronyd exiting
Feb 16 13:01:45 compute-1 systemd[1]: Stopping NTP client/server...
Feb 16 13:01:45 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Feb 16 13:01:45 compute-1 systemd[1]: Stopped NTP client/server.
Feb 16 13:01:45 compute-1 systemd[1]: Starting NTP client/server...
Feb 16 13:01:45 compute-1 chronyd[65758]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 13:01:45 compute-1 chronyd[65758]: Frequency -26.831 +/- 0.300 ppm read from /var/lib/chrony/drift
Feb 16 13:01:45 compute-1 chronyd[65758]: Loaded seccomp filter (level 2)
Feb 16 13:01:45 compute-1 systemd[1]: Started NTP client/server.
Feb 16 13:01:45 compute-1 sudo[65748]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:46 compute-1 sshd-session[60899]: Connection closed by 192.168.122.30 port 54834
Feb 16 13:01:46 compute-1 sshd-session[60896]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:01:46 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Feb 16 13:01:46 compute-1 systemd[1]: session-15.scope: Consumed 22.185s CPU time.
Feb 16 13:01:46 compute-1 systemd-logind[821]: Session 15 logged out. Waiting for processes to exit.
Feb 16 13:01:46 compute-1 systemd-logind[821]: Removed session 15.
Feb 16 13:01:51 compute-1 sshd-session[65784]: Accepted publickey for zuul from 192.168.122.30 port 37648 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:01:51 compute-1 systemd-logind[821]: New session 16 of user zuul.
Feb 16 13:01:51 compute-1 systemd[1]: Started Session 16 of User zuul.
Feb 16 13:01:51 compute-1 sshd-session[65784]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:01:52 compute-1 python3.9[65937]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:01:53 compute-1 sudo[66093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhdyrtzdkmgdkqzlmkbkoiejqrxlrdhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246913.473772-47-146721773987110/AnsiballZ_file.py'
Feb 16 13:01:53 compute-1 sudo[66093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:54 compute-1 python3.9[66095]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:54 compute-1 sudo[66093]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:54 compute-1 sshd-session[65966]: Connection closed by authenticating user root 142.93.238.36 port 39006 [preauth]
Feb 16 13:01:54 compute-1 sudo[66268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsuxhsqzsxjnngvzcnsgmxnfsnagkqky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246914.330941-63-38532204374529/AnsiballZ_stat.py'
Feb 16 13:01:54 compute-1 sudo[66268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:54 compute-1 python3.9[66270]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:55 compute-1 sudo[66268]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:55 compute-1 sudo[66346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kelupasxkewggyagljhvffcgyiqdecjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246914.330941-63-38532204374529/AnsiballZ_file.py'
Feb 16 13:01:55 compute-1 sudo[66346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:55 compute-1 python3.9[66348]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ay8c7pg4 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:55 compute-1 sudo[66346]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:56 compute-1 sudo[66498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsnimeqouhasovipmzqxvztjyvpgrfva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246915.8113275-103-73420932851760/AnsiballZ_stat.py'
Feb 16 13:01:56 compute-1 sudo[66498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:56 compute-1 python3.9[66500]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:56 compute-1 sudo[66498]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:56 compute-1 sudo[66621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbxewlbtiwmorihuprkywqosllgttwxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246915.8113275-103-73420932851760/AnsiballZ_copy.py'
Feb 16 13:01:56 compute-1 sudo[66621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:56 compute-1 python3.9[66623]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246915.8113275-103-73420932851760/.source _original_basename=.y2ah45m1 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:01:56 compute-1 sudo[66621]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:57 compute-1 sudo[66773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyanfwwrwwstbrgzpjgmqlkrtzyawapb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.053073-135-138361566705443/AnsiballZ_file.py'
Feb 16 13:01:57 compute-1 sudo[66773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:57 compute-1 python3.9[66775]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:57 compute-1 sudo[66773]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:57 compute-1 sudo[66925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkeyssybpqzkwphgicjdzcbjjecwmyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.6875718-151-71273579097606/AnsiballZ_stat.py'
Feb 16 13:01:57 compute-1 sudo[66925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:58 compute-1 python3.9[66927]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:58 compute-1 sudo[66925]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:58 compute-1 sudo[67048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzjolghindxpacxwayalnadroxgkqfkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246917.6875718-151-71273579097606/AnsiballZ_copy.py'
Feb 16 13:01:58 compute-1 sudo[67048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:58 compute-1 python3.9[67050]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246917.6875718-151-71273579097606/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:58 compute-1 sudo[67048]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:58 compute-1 sudo[67200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciwhlhcpjebxflvisnhaqcglzrttsaia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246918.7275512-151-172921418430646/AnsiballZ_stat.py'
Feb 16 13:01:58 compute-1 sudo[67200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:59 compute-1 python3.9[67202]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:01:59 compute-1 sudo[67200]: pam_unix(sudo:session): session closed for user root
Feb 16 13:01:59 compute-1 sudo[67323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nonakizksbuzusqczxtxmdvimlaludqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246918.7275512-151-172921418430646/AnsiballZ_copy.py'
Feb 16 13:01:59 compute-1 sudo[67323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:01:59 compute-1 python3.9[67325]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771246918.7275512-151-172921418430646/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:01:59 compute-1 sudo[67323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:00 compute-1 sudo[67475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsdffylsxgdywetatvxocxsqfcmixyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246919.8565948-209-207612998924366/AnsiballZ_file.py'
Feb 16 13:02:00 compute-1 sudo[67475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:00 compute-1 python3.9[67477]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:00 compute-1 sudo[67475]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:00 compute-1 sudo[67627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgqkuuaaytrrbhdbsubzfczgytynvoar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246920.3976476-225-6171315584522/AnsiballZ_stat.py'
Feb 16 13:02:00 compute-1 sudo[67627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:00 compute-1 python3.9[67629]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:00 compute-1 sudo[67627]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:01 compute-1 sudo[67750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywpmciznexwbiwlnatzumpzgmpagllns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246920.3976476-225-6171315584522/AnsiballZ_copy.py'
Feb 16 13:02:01 compute-1 sudo[67750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:01 compute-1 python3.9[67752]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246920.3976476-225-6171315584522/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:01 compute-1 sudo[67750]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:01 compute-1 sudo[67902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvhwpsdzygujtqblqnduyfxzhwcrfiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246921.5433588-255-233225055188903/AnsiballZ_stat.py'
Feb 16 13:02:01 compute-1 sudo[67902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:01 compute-1 python3.9[67904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:01 compute-1 sudo[67902]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:02 compute-1 sudo[68025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edsuwwerwagjbzchniepozsojcuujdhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246921.5433588-255-233225055188903/AnsiballZ_copy.py'
Feb 16 13:02:02 compute-1 sudo[68025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:02 compute-1 python3.9[68027]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246921.5433588-255-233225055188903/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:02 compute-1 sudo[68025]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:03 compute-1 sudo[68177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsaptyrdofnbozkidthxqhznfgpablvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246922.6509051-285-117109626826995/AnsiballZ_systemd.py'
Feb 16 13:02:03 compute-1 sudo[68177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:03 compute-1 python3.9[68179]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:03 compute-1 systemd[1]: Reloading.
Feb 16 13:02:03 compute-1 systemd-sysv-generator[68210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:03 compute-1 systemd-rc-local-generator[68203]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:03 compute-1 systemd[1]: Reloading.
Feb 16 13:02:03 compute-1 systemd-rc-local-generator[68251]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:03 compute-1 systemd-sysv-generator[68254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:03 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Feb 16 13:02:03 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Feb 16 13:02:03 compute-1 sudo[68177]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:04 compute-1 sudo[68418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnvvqdgvxiaqpjgrlrcaorsoczziznco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246924.1039777-301-18138116753585/AnsiballZ_stat.py'
Feb 16 13:02:04 compute-1 sudo[68418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:04 compute-1 python3.9[68420]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:04 compute-1 sudo[68418]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:04 compute-1 sudo[68541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqwnsckgyqygxmdxjqyueokgsunrmcoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246924.1039777-301-18138116753585/AnsiballZ_copy.py'
Feb 16 13:02:04 compute-1 sudo[68541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:04 compute-1 python3.9[68543]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246924.1039777-301-18138116753585/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:05 compute-1 sudo[68541]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:05 compute-1 sudo[68693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiclgrtevcgmharkysxaavwmympqxohg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246925.1840627-331-42282869587940/AnsiballZ_stat.py'
Feb 16 13:02:05 compute-1 sudo[68693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:05 compute-1 python3.9[68695]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:05 compute-1 sudo[68693]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:05 compute-1 sudo[68816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luwnsdnmfgwzakakndxwavsabpodtrsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246925.1840627-331-42282869587940/AnsiballZ_copy.py'
Feb 16 13:02:05 compute-1 sudo[68816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:06 compute-1 python3.9[68818]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246925.1840627-331-42282869587940/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:06 compute-1 sudo[68816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:06 compute-1 sudo[68968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjhxuqxkvqezuwxriwjbdjqpqiuceteg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246926.2367659-361-214337375433781/AnsiballZ_systemd.py'
Feb 16 13:02:06 compute-1 sudo[68968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:06 compute-1 python3.9[68970]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:06 compute-1 systemd[1]: Reloading.
Feb 16 13:02:06 compute-1 systemd-sysv-generator[68996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:06 compute-1 systemd-rc-local-generator[68993]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:07 compute-1 systemd[1]: Reloading.
Feb 16 13:02:07 compute-1 systemd-rc-local-generator[69043]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:07 compute-1 systemd-sysv-generator[69046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:07 compute-1 systemd[1]: Starting Create netns directory...
Feb 16 13:02:07 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:02:07 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:02:07 compute-1 systemd[1]: Finished Create netns directory.
Feb 16 13:02:07 compute-1 sudo[68968]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:08 compute-1 python3.9[69210]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:02:08 compute-1 network[69227]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:02:08 compute-1 network[69228]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:02:08 compute-1 network[69229]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:02:11 compute-1 sudo[69490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qapisjvbrjhclfoiwqqxenqljrohctfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246931.251367-393-57556591524674/AnsiballZ_systemd.py'
Feb 16 13:02:11 compute-1 sudo[69490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:11 compute-1 python3.9[69492]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:11 compute-1 systemd[1]: Reloading.
Feb 16 13:02:11 compute-1 systemd-rc-local-generator[69520]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:11 compute-1 systemd-sysv-generator[69525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:12 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 16 13:02:12 compute-1 iptables.init[69539]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 16 13:02:12 compute-1 iptables.init[69539]: iptables: Flushing firewall rules: [  OK  ]
Feb 16 13:02:12 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Feb 16 13:02:12 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 16 13:02:12 compute-1 sudo[69490]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:12 compute-1 sudo[69734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzzxfddignbiudoqkqyjwzijvlhagcth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246932.594741-393-221646777398066/AnsiballZ_systemd.py'
Feb 16 13:02:12 compute-1 sudo[69734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:13 compute-1 python3.9[69736]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:13 compute-1 sudo[69734]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:13 compute-1 sudo[69888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfvnhyjmktuspgmdqijbdazrprwimmba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246933.5413418-425-51432667592356/AnsiballZ_systemd.py'
Feb 16 13:02:13 compute-1 sudo[69888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:14 compute-1 python3.9[69890]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:02:14 compute-1 systemd[1]: Reloading.
Feb 16 13:02:14 compute-1 systemd-sysv-generator[69921]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:02:14 compute-1 systemd-rc-local-generator[69916]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:02:14 compute-1 systemd[1]: Starting Netfilter Tables...
Feb 16 13:02:14 compute-1 systemd[1]: Finished Netfilter Tables.
Feb 16 13:02:14 compute-1 sudo[69888]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:15 compute-1 sudo[70087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fckrhibmznnxgwfrwunikelvdybfprve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246934.7177866-441-120942453635909/AnsiballZ_command.py'
Feb 16 13:02:15 compute-1 sudo[70087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:15 compute-1 python3.9[70089]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:15 compute-1 sudo[70087]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:16 compute-1 sudo[70240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqzvncpwixvclczaxsfphmkhdwvnemwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246935.8549502-470-235462565353768/AnsiballZ_stat.py'
Feb 16 13:02:16 compute-1 sudo[70240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:16 compute-1 python3.9[70242]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:16 compute-1 sudo[70240]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:16 compute-1 sudo[70365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjzmapwhkntadcixfsffqgisioewdath ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246935.8549502-470-235462565353768/AnsiballZ_copy.py'
Feb 16 13:02:16 compute-1 sudo[70365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:16 compute-1 python3.9[70367]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246935.8549502-470-235462565353768/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:16 compute-1 sudo[70365]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:17 compute-1 sudo[70518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfycmpheqcrhbguxdvswbgrqitdgulne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246937.5800695-499-73097429534043/AnsiballZ_systemd.py'
Feb 16 13:02:17 compute-1 sudo[70518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:18 compute-1 python3.9[70520]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:02:18 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Feb 16 13:02:18 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Feb 16 13:02:18 compute-1 sshd[1017]: Received SIGHUP; restarting.
Feb 16 13:02:18 compute-1 sshd[1017]: Server listening on 0.0.0.0 port 22.
Feb 16 13:02:18 compute-1 sshd[1017]: Server listening on :: port 22.
Feb 16 13:02:18 compute-1 sudo[70518]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:18 compute-1 sudo[70674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcidqmlnkmrtutnsopmnqiofvzvhswii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246938.4555128-515-277363124879773/AnsiballZ_file.py'
Feb 16 13:02:18 compute-1 sudo[70674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:18 compute-1 python3.9[70676]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:18 compute-1 sudo[70674]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:19 compute-1 sudo[70826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfbcbblavvtkmqsucngtlqllkptkcsrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246939.0517619-531-218127645304146/AnsiballZ_stat.py'
Feb 16 13:02:19 compute-1 sudo[70826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:19 compute-1 python3.9[70828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:19 compute-1 sudo[70826]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:19 compute-1 sudo[70949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzbgdxwunoololnanoqcmbwlzlglnxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246939.0517619-531-218127645304146/AnsiballZ_copy.py'
Feb 16 13:02:19 compute-1 sudo[70949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:20 compute-1 python3.9[70951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246939.0517619-531-218127645304146/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:20 compute-1 sudo[70949]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:20 compute-1 sudo[71101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlydbvkqlbymvuzhlgocgukgrscgakxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246940.4737175-567-6313678541579/AnsiballZ_timezone.py'
Feb 16 13:02:20 compute-1 sudo[71101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:21 compute-1 python3.9[71103]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 13:02:21 compute-1 systemd[1]: Starting Time & Date Service...
Feb 16 13:02:21 compute-1 systemd[1]: Started Time & Date Service.
Feb 16 13:02:21 compute-1 sudo[71101]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:22 compute-1 sudo[71257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywiieiaxzxdjrpocilfemxnpkaqhqup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246941.7801254-585-235329372606761/AnsiballZ_file.py'
Feb 16 13:02:22 compute-1 sudo[71257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:22 compute-1 python3.9[71259]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:22 compute-1 sudo[71257]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:23 compute-1 sudo[71409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgyhjlcwnoldiagarctvyejdklyngeon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246942.3469605-601-205916738118697/AnsiballZ_stat.py'
Feb 16 13:02:23 compute-1 sudo[71409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:23 compute-1 python3.9[71411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:23 compute-1 sudo[71409]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:23 compute-1 sudo[71532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btdycrjbzvrzqdaujsuigymbflyofsvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246942.3469605-601-205916738118697/AnsiballZ_copy.py'
Feb 16 13:02:23 compute-1 sudo[71532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:23 compute-1 python3.9[71534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246942.3469605-601-205916738118697/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:23 compute-1 sudo[71532]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:24 compute-1 sudo[71684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnghmeqjgccexxkjckqmfemrlihvmdfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246944.0404453-631-109130529881416/AnsiballZ_stat.py'
Feb 16 13:02:24 compute-1 sudo[71684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:24 compute-1 python3.9[71686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:24 compute-1 sudo[71684]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:24 compute-1 sudo[71807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhzethsjdswjaqwybvtbzzqeuoljhrtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246944.0404453-631-109130529881416/AnsiballZ_copy.py'
Feb 16 13:02:24 compute-1 sudo[71807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:25 compute-1 python3.9[71809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771246944.0404453-631-109130529881416/.source.yaml _original_basename=.ygdclfll follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:25 compute-1 sudo[71807]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:25 compute-1 sudo[71959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjjravotvvjtklrnpkmjsaxtyczekpvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246945.2044535-661-44880444811891/AnsiballZ_stat.py'
Feb 16 13:02:25 compute-1 sudo[71959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:25 compute-1 python3.9[71961]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:25 compute-1 sudo[71959]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:25 compute-1 sudo[72082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxpleawepucmzvweuenizoymeovopahh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246945.2044535-661-44880444811891/AnsiballZ_copy.py'
Feb 16 13:02:25 compute-1 sudo[72082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:26 compute-1 python3.9[72084]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246945.2044535-661-44880444811891/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:26 compute-1 sudo[72082]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:26 compute-1 sudo[72234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szvwsybyqeynkeeoccbyjgjantdimwwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246946.3472066-691-33314655163984/AnsiballZ_command.py'
Feb 16 13:02:26 compute-1 sudo[72234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:26 compute-1 python3.9[72236]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:26 compute-1 sudo[72234]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:27 compute-1 sudo[72387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffqonidqrpoucpoqzgxbjjpowpfdkkpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246947.177777-707-193916597575513/AnsiballZ_command.py'
Feb 16 13:02:27 compute-1 sudo[72387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:27 compute-1 python3.9[72389]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:27 compute-1 sudo[72387]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:28 compute-1 sudo[72540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmlnxvbbelfgodnrmgpxsqadcjfyjwgd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771246947.807349-723-23843973326543/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:02:28 compute-1 sudo[72540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:28 compute-1 python3[72542]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:02:28 compute-1 sudo[72540]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:28 compute-1 sudo[72692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egsilhpomcryewjrnvxumiyosqmcagrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246948.7301962-741-128666542016797/AnsiballZ_stat.py'
Feb 16 13:02:28 compute-1 sudo[72692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:29 compute-1 python3.9[72694]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:29 compute-1 sudo[72692]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:29 compute-1 sudo[72815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvhncwzwgealaiwimbdcnosizcxfbplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246948.7301962-741-128666542016797/AnsiballZ_copy.py'
Feb 16 13:02:29 compute-1 sudo[72815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:29 compute-1 python3.9[72817]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246948.7301962-741-128666542016797/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:29 compute-1 sudo[72815]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:30 compute-1 sudo[72967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsptakypiqmxlhxxxglgkfnirdfdzfph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246949.8589108-769-56553770050248/AnsiballZ_stat.py'
Feb 16 13:02:30 compute-1 sudo[72967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:30 compute-1 python3.9[72969]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:30 compute-1 sudo[72967]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:30 compute-1 sudo[73090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxtbtcwriojrlcqzhhhsaluwvqjmldjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246949.8589108-769-56553770050248/AnsiballZ_copy.py'
Feb 16 13:02:30 compute-1 sudo[73090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:31 compute-1 python3.9[73092]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246949.8589108-769-56553770050248/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:31 compute-1 sudo[73090]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:31 compute-1 sudo[73242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btvuuycsusbjwyoglbjzvsvlbvahvofo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246951.3049102-799-229994113306480/AnsiballZ_stat.py'
Feb 16 13:02:31 compute-1 sudo[73242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:31 compute-1 python3.9[73244]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:31 compute-1 sudo[73242]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:32 compute-1 sudo[73365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qunkobpxxtfwynfilyzsomlhveznfhza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246951.3049102-799-229994113306480/AnsiballZ_copy.py'
Feb 16 13:02:32 compute-1 sudo[73365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:32 compute-1 python3.9[73367]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246951.3049102-799-229994113306480/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:32 compute-1 sudo[73365]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:32 compute-1 sudo[73517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-askkaaiwrittndakjahfsfycwvxmhdcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246952.4826574-829-250364000207403/AnsiballZ_stat.py'
Feb 16 13:02:32 compute-1 sudo[73517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:32 compute-1 python3.9[73519]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:32 compute-1 sudo[73517]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:33 compute-1 sudo[73640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxcyvcashxpcewnbfmwnibembpkiiio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246952.4826574-829-250364000207403/AnsiballZ_copy.py'
Feb 16 13:02:33 compute-1 sudo[73640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:33 compute-1 python3.9[73642]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246952.4826574-829-250364000207403/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:33 compute-1 sudo[73640]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:33 compute-1 sudo[73792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvctziagnevkkufnabzkcnkldjepcego ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246953.6888833-859-74072305067313/AnsiballZ_stat.py'
Feb 16 13:02:33 compute-1 sudo[73792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:34 compute-1 python3.9[73794]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:02:34 compute-1 sudo[73792]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:34 compute-1 sudo[73915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezeyyjjliplmfypycybpqyikgciporie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246953.6888833-859-74072305067313/AnsiballZ_copy.py'
Feb 16 13:02:34 compute-1 sudo[73915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:34 compute-1 python3.9[73917]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771246953.6888833-859-74072305067313/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:34 compute-1 sudo[73915]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:35 compute-1 sudo[74067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlcmcxhjsvypmxbotpnloyhdjatveyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246955.1605678-889-277225808241144/AnsiballZ_file.py'
Feb 16 13:02:35 compute-1 sudo[74067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:35 compute-1 python3.9[74069]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:35 compute-1 sudo[74067]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:35 compute-1 sudo[74219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeuitoozmlryriizwhbubiigjjvlhmyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246955.7653468-905-56051032938451/AnsiballZ_command.py'
Feb 16 13:02:35 compute-1 sudo[74219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:36 compute-1 python3.9[74221]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:36 compute-1 sudo[74219]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:36 compute-1 sudo[74378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulsynpgiqgpttnjvlnagzxyctbcfgnnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246956.4884048-921-191220391730775/AnsiballZ_blockinfile.py'
Feb 16 13:02:36 compute-1 sudo[74378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:37 compute-1 python3.9[74380]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:37 compute-1 sudo[74378]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:37 compute-1 sudo[74531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncbhffopsaxzifwskggujaarbhwgmqql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246957.3325446-939-172186022167640/AnsiballZ_file.py'
Feb 16 13:02:37 compute-1 sudo[74531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:37 compute-1 python3.9[74533]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:37 compute-1 sudo[74531]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:38 compute-1 sudo[74683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahbzmlvzffktvasgdmqoaorbgnhvfodm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246957.8503826-939-122768195557151/AnsiballZ_file.py'
Feb 16 13:02:38 compute-1 sudo[74683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:38 compute-1 python3.9[74685]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:38 compute-1 sudo[74683]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:39 compute-1 sudo[74835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mowihjltjcgqwuxqseobtvnknlxocdhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246958.5500357-969-103763446377411/AnsiballZ_mount.py'
Feb 16 13:02:39 compute-1 sudo[74835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:39 compute-1 python3.9[74837]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 13:02:39 compute-1 sudo[74835]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:39 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:02:39 compute-1 sudo[74989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tudxuymnkjwqwutgqnuksusickhlqrhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246959.5489283-969-247071840496178/AnsiballZ_mount.py'
Feb 16 13:02:39 compute-1 sudo[74989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:39 compute-1 python3.9[74991]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 13:02:40 compute-1 sudo[74989]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:40 compute-1 sshd-session[65787]: Connection closed by 192.168.122.30 port 37648
Feb 16 13:02:40 compute-1 sshd-session[65784]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:02:40 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Feb 16 13:02:40 compute-1 systemd[1]: session-16.scope: Consumed 29.100s CPU time.
Feb 16 13:02:40 compute-1 systemd-logind[821]: Session 16 logged out. Waiting for processes to exit.
Feb 16 13:02:40 compute-1 systemd-logind[821]: Removed session 16.
Feb 16 13:02:46 compute-1 sshd-session[75018]: Accepted publickey for zuul from 192.168.122.30 port 35716 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:02:46 compute-1 systemd-logind[821]: New session 17 of user zuul.
Feb 16 13:02:46 compute-1 systemd[1]: Started Session 17 of User zuul.
Feb 16 13:02:46 compute-1 sshd-session[75018]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:02:47 compute-1 sudo[75171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbxxbvaqoxkrmgkyuamxxvjetdgoukqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246966.9885328-23-123739517023750/AnsiballZ_tempfile.py'
Feb 16 13:02:47 compute-1 sudo[75171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:47 compute-1 python3.9[75173]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 16 13:02:47 compute-1 sudo[75171]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:48 compute-1 sudo[75323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nskyzsdjobwjsktzxkaqoxzpseqrsmom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246967.838921-47-81657766539609/AnsiballZ_stat.py'
Feb 16 13:02:48 compute-1 sudo[75323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:48 compute-1 python3.9[75325]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:02:48 compute-1 sudo[75323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:49 compute-1 sudo[75475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upmohmbswpnkiwpxuzaudnykdqwvwjtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246968.6669931-67-227453689765474/AnsiballZ_setup.py'
Feb 16 13:02:49 compute-1 sudo[75475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:49 compute-1 python3.9[75477]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:02:49 compute-1 sudo[75475]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:50 compute-1 sudo[75627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vporhciusjyervweavgoskylcnexkxba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246969.8222587-84-49134782297313/AnsiballZ_blockinfile.py'
Feb 16 13:02:50 compute-1 sudo[75627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:50 compute-1 python3.9[75629]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCZXltIir5JBlvlPO7xLUmiSTW31gnyk58yBpRPAz3e1zsBdp3tp69owh1HkTmYc2BM0dUw6H1M71VuxI0SapqM1d5LkcPPguX1Mq7TGAQn2dsX6Piigs5Cgp5OXbpdp5/nJMF2TC4TLMXXab89NyRA6uh3T423AM/mWQEIBu3i252ANCm921kcESMFJPNdnV1B74UKLptZ8BUaExyksvXJtesoOoU5tovgAd4TFk6u4EgLNEnb8afOk11FDJSnTBwtrYzIJNIgo8EgU+JaDlS1BWU88QSSGYyUwznbA09nebRk89Vy+XJ9DYHlXuPU2Iz50yOO6dFTk0sfOA3/GDlBF2Z+I3eusAQR53HhJ08/uLwfEXOwpQfqAHgmIKXGBIcOSswNRXBRDVy46MVK0bxxRtljm2BHlSo/ayqvw63HW9V555GjPAmhtovOPJeX2/GaUTA8/48ZVHcvu7bnAjxyJK6lDyp1kVm2Zv0x8bl1mFbuOBi39ZeK+zc5AADi7CM=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4vGkqkehtuFud87VeuVaCZH32Y8wUj3DYUltABnLf7
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOElrSCFS57nxTxtJ648wl1DVoVcAkQzVPwikLAgiomC/pYBiXtlGQhPs9E4LrY5DDQLvhyYJ2yYdVE/4SRyISo=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+TVRcBupgzmTdjTAl4PntIbYWysOWMtXN0mftJlk/ePbSMt7K7+RZk++0YQpyUwPpty3/HFX3wQRF9viEaPSK5jMBRYoRn1WdvOeWGMCfK7TDigQW8ojVMgO804XHVvxEG717wW4/uLTu8DPqc5HNEqPzVl1GTH02Xj7g2O/FoQmpQuTeoquar0XxVxfiemUgZKGMCLaArrVK5u5oEXiiXWIGno1zlGwZ78bOq/csrxTZqVtXhSr8cszXUWFTqDh9bafxdl/Lj8NyfjG/pw7mTjSD+9KfpGTW4PCTru5Yp7Jr0AGSNqcWo76aGPWIuF5Ev6byLNM9NPjyT//iGN7Ez8x7GshAoUtHZ4BytRuL71hTYzRVU4t/21c5bLoo+aaeQ6RYB8U2VTh3L8gL6mB7oL+u+ZyVhDLvrlb4OK7yG6PpFBqlvXqK92lIC7x+tqrLYh28gfkawLOE2pnD8CzcuZfIc/aZ41WDNkE+0xSUlVPieVEvR+Og9hYKqxEJjjs=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHG+w/iJ+Sd31ZLka6ki6wkHvTMxAiuRNHy6U6ZlO/c7
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK6j3HO6Ca/k3kBzCq5boL0wpgaz3/9NCyn+y/MXv7x/dYitXgqAC8QrwaYe9xZNnaPdzPecAgq1NX9k2zO6yoI=
                                             create=True mode=0644 path=/tmp/ansible.z6uzgd85 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:50 compute-1 sudo[75627]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:51 compute-1 sudo[75779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zttyvevrisceybslbfkuhozzlratblfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246970.606318-100-270581447923957/AnsiballZ_command.py'
Feb 16 13:02:51 compute-1 sudo[75779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:51 compute-1 python3.9[75781]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.z6uzgd85' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:02:51 compute-1 sudo[75779]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:51 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 13:02:51 compute-1 sudo[75935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzflxleudmismdzzjogfueotebzdhabc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246971.5864768-116-51962785234329/AnsiballZ_file.py'
Feb 16 13:02:51 compute-1 sudo[75935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:02:52 compute-1 python3.9[75937]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.z6uzgd85 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:02:52 compute-1 sudo[75935]: pam_unix(sudo:session): session closed for user root
Feb 16 13:02:52 compute-1 sshd-session[75021]: Connection closed by 192.168.122.30 port 35716
Feb 16 13:02:52 compute-1 sshd-session[75018]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:02:52 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Feb 16 13:02:52 compute-1 systemd[1]: session-17.scope: Consumed 2.866s CPU time.
Feb 16 13:02:52 compute-1 systemd-logind[821]: Session 17 logged out. Waiting for processes to exit.
Feb 16 13:02:52 compute-1 systemd-logind[821]: Removed session 17.
Feb 16 13:02:58 compute-1 sshd-session[75964]: Accepted publickey for zuul from 192.168.122.30 port 44628 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:02:58 compute-1 systemd-logind[821]: New session 18 of user zuul.
Feb 16 13:02:58 compute-1 systemd[1]: Started Session 18 of User zuul.
Feb 16 13:02:58 compute-1 sshd-session[75964]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:02:58 compute-1 sshd-session[75962]: Connection closed by authenticating user root 146.190.226.24 port 45744 [preauth]
Feb 16 13:02:59 compute-1 python3.9[76117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:00 compute-1 sudo[76271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsnuygqvoijwvkccgmskcomdldsgqlar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246979.5314639-45-87432300861385/AnsiballZ_systemd.py'
Feb 16 13:03:00 compute-1 sudo[76271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:00 compute-1 python3.9[76273]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 13:03:00 compute-1 sudo[76271]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:00 compute-1 sudo[76425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htiscutszbwvnkbcffldwyugybqxnfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246980.7054205-61-239208971238609/AnsiballZ_systemd.py'
Feb 16 13:03:00 compute-1 sudo[76425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:01 compute-1 python3.9[76427]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:03:01 compute-1 sudo[76425]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:01 compute-1 sudo[76578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiegyyhnmxyiezworaztmaxkpfsytjta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246981.5749753-79-11997566786804/AnsiballZ_command.py'
Feb 16 13:03:01 compute-1 sudo[76578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:02 compute-1 python3.9[76580]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:02 compute-1 sudo[76578]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:02 compute-1 sudo[76731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uapssjygubvxsvsfqsiasalnfelksdaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246982.327617-95-60470123072255/AnsiballZ_stat.py'
Feb 16 13:03:02 compute-1 sudo[76731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:02 compute-1 python3.9[76733]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:02 compute-1 sudo[76731]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:03 compute-1 sudo[76885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpnmqtjvywrupokthzapvrcbmelblwfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246983.0848808-111-53652030778605/AnsiballZ_command.py'
Feb 16 13:03:03 compute-1 sudo[76885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:03 compute-1 python3.9[76887]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:03 compute-1 sudo[76885]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:04 compute-1 sudo[77040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztunmdccgdnlwahsmpethovzatfmdwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246983.7486718-127-132050304252530/AnsiballZ_file.py'
Feb 16 13:03:04 compute-1 sudo[77040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:04 compute-1 python3.9[77042]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:04 compute-1 sudo[77040]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:05 compute-1 sshd-session[75967]: Connection closed by 192.168.122.30 port 44628
Feb 16 13:03:05 compute-1 sshd-session[75964]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:03:05 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Feb 16 13:03:05 compute-1 systemd[1]: session-18.scope: Consumed 3.749s CPU time.
Feb 16 13:03:05 compute-1 systemd-logind[821]: Session 18 logged out. Waiting for processes to exit.
Feb 16 13:03:05 compute-1 systemd-logind[821]: Removed session 18.
Feb 16 13:03:10 compute-1 sshd-session[77068]: Accepted publickey for zuul from 192.168.122.30 port 59678 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:03:10 compute-1 systemd-logind[821]: New session 19 of user zuul.
Feb 16 13:03:10 compute-1 systemd[1]: Started Session 19 of User zuul.
Feb 16 13:03:10 compute-1 sshd-session[77068]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:03:11 compute-1 python3.9[77221]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:12 compute-1 sudo[77375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospumdagqdkbburcxzntdbbxtovuuzgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246992.4374008-49-199176813975138/AnsiballZ_setup.py'
Feb 16 13:03:12 compute-1 sudo[77375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:12 compute-1 python3.9[77377]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:03:13 compute-1 sudo[77375]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:13 compute-1 sudo[77459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uezmlaexvfarievjygvghvrovilautdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771246992.4374008-49-199176813975138/AnsiballZ_dnf.py'
Feb 16 13:03:13 compute-1 sudo[77459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:13 compute-1 python3.9[77461]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 13:03:15 compute-1 sudo[77459]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:16 compute-1 python3.9[77612]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:03:17 compute-1 python3.9[77763]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:03:18 compute-1 python3.9[77913]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:19 compute-1 python3.9[78063]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:03:20 compute-1 sshd-session[77071]: Connection closed by 192.168.122.30 port 59678
Feb 16 13:03:20 compute-1 sshd-session[77068]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:03:20 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Feb 16 13:03:20 compute-1 systemd[1]: session-19.scope: Consumed 5.278s CPU time.
Feb 16 13:03:20 compute-1 systemd-logind[821]: Session 19 logged out. Waiting for processes to exit.
Feb 16 13:03:20 compute-1 systemd-logind[821]: Removed session 19.
Feb 16 13:03:25 compute-1 sshd-session[78088]: Accepted publickey for zuul from 192.168.122.30 port 38836 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:03:25 compute-1 systemd-logind[821]: New session 20 of user zuul.
Feb 16 13:03:25 compute-1 systemd[1]: Started Session 20 of User zuul.
Feb 16 13:03:25 compute-1 sshd-session[78088]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:03:26 compute-1 python3.9[78241]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:03:28 compute-1 sudo[78395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtdikdezlexobpsrickssnfnqzfjeufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247008.0435538-80-135874705070742/AnsiballZ_file.py'
Feb 16 13:03:28 compute-1 sudo[78395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:28 compute-1 python3.9[78397]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:28 compute-1 sudo[78395]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:29 compute-1 sudo[78547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoxecybpzvohrihjbuisqukagyjryink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247008.7876346-80-34616943731792/AnsiballZ_file.py'
Feb 16 13:03:29 compute-1 sudo[78547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:29 compute-1 python3.9[78549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:29 compute-1 sudo[78547]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:29 compute-1 sudo[78699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztcqcagmzxzqhcexgemqykkeycomfmew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247009.4363878-111-143256314267000/AnsiballZ_stat.py'
Feb 16 13:03:29 compute-1 sudo[78699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:30 compute-1 python3.9[78701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:30 compute-1 sudo[78699]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:30 compute-1 sudo[78822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yevmumiobjqadjfbexzfzoitnkttjite ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247009.4363878-111-143256314267000/AnsiballZ_copy.py'
Feb 16 13:03:30 compute-1 sudo[78822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:30 compute-1 python3.9[78824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247009.4363878-111-143256314267000/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=aeda5ed39751d0bc3b792c20dfea96cfa4384979 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:30 compute-1 sudo[78822]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:31 compute-1 sudo[78974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsxpukzanjaezubhrnjxfdqqgvigobvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247010.773491-111-134055269881394/AnsiballZ_stat.py'
Feb 16 13:03:31 compute-1 sudo[78974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:31 compute-1 python3.9[78976]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:31 compute-1 sudo[78974]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:31 compute-1 sudo[79097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iavmhmixoccqnqxbmimrfjhyztsydibf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247010.773491-111-134055269881394/AnsiballZ_copy.py'
Feb 16 13:03:31 compute-1 sudo[79097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:31 compute-1 python3.9[79099]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247010.773491-111-134055269881394/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c9dfcac50e41db328d94f1391b1aaddaecead554 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:31 compute-1 sudo[79097]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:32 compute-1 sudo[79249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdhaeizcrubotrmtgxyymxxudfqypiqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247011.9038937-111-159626547459723/AnsiballZ_stat.py'
Feb 16 13:03:32 compute-1 sudo[79249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:32 compute-1 python3.9[79251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:32 compute-1 sudo[79249]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:32 compute-1 sudo[79372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifxgakkomzlrhjdfhwhbdztbyfezoiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247011.9038937-111-159626547459723/AnsiballZ_copy.py'
Feb 16 13:03:32 compute-1 sudo[79372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:32 compute-1 python3.9[79374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247011.9038937-111-159626547459723/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=70996ce4c13567315c50a1baf7ecc94f5d0937ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:32 compute-1 sudo[79372]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:33 compute-1 sudo[79524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kapthpacuzbqobsjznylwgpljswaeacl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247013.0323339-196-14239669101629/AnsiballZ_file.py'
Feb 16 13:03:33 compute-1 sudo[79524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:33 compute-1 python3.9[79526]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:33 compute-1 sudo[79524]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:33 compute-1 sudo[79676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anfqjuctiipvlwsguldejrwzismfvunk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247013.6335459-196-4164595836194/AnsiballZ_file.py'
Feb 16 13:03:33 compute-1 sudo[79676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:34 compute-1 python3.9[79678]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:34 compute-1 sudo[79676]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:34 compute-1 sudo[79828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgyiggwbnnquudmuezhjiznkofisoadu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247014.2312794-227-188617728770390/AnsiballZ_stat.py'
Feb 16 13:03:34 compute-1 sudo[79828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:34 compute-1 python3.9[79830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:34 compute-1 sudo[79828]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:34 compute-1 sudo[79951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nngxuhudqoixdfaxtiysttycyauihbcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247014.2312794-227-188617728770390/AnsiballZ_copy.py'
Feb 16 13:03:34 compute-1 sudo[79951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:35 compute-1 python3.9[79953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247014.2312794-227-188617728770390/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=52dff5b6c395a7c209993fc51499b4a718f01d46 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:35 compute-1 sudo[79951]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:35 compute-1 sudo[80103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heajlosvqdsvyroemhhvfbozlrjgekkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247015.4836001-227-133303876026181/AnsiballZ_stat.py'
Feb 16 13:03:35 compute-1 sudo[80103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:35 compute-1 python3.9[80105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:35 compute-1 sudo[80103]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:36 compute-1 sudo[80226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olzysbdajpsfsqkeqizhuchlebvaxwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247015.4836001-227-133303876026181/AnsiballZ_copy.py'
Feb 16 13:03:36 compute-1 sudo[80226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:36 compute-1 python3.9[80228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247015.4836001-227-133303876026181/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=c95f61194c2ceee3c16fe7cffc94cfc98ee9379b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:36 compute-1 sudo[80226]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:36 compute-1 sudo[80378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kshtqmsvokgamuyzesbhcurmupovdrze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247016.641061-227-272491441326538/AnsiballZ_stat.py'
Feb 16 13:03:36 compute-1 sudo[80378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:37 compute-1 python3.9[80380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:37 compute-1 sudo[80378]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:37 compute-1 sudo[80501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlnmshkygqtbhcmhpvvfbqzbmocxhfpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247016.641061-227-272491441326538/AnsiballZ_copy.py'
Feb 16 13:03:37 compute-1 sudo[80501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:37 compute-1 python3.9[80503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247016.641061-227-272491441326538/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=2be8bff8905f803206923da54d33f1be96b1fc1b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:37 compute-1 sudo[80501]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:38 compute-1 sudo[80653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lduizfzojwihwppvtljewqlucqjwzyaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247017.8085775-313-93216906240736/AnsiballZ_file.py'
Feb 16 13:03:38 compute-1 sudo[80653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:38 compute-1 python3.9[80655]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:38 compute-1 sudo[80653]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:38 compute-1 sudo[80805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byladbpozauwdxejpzdxwtetoqjqgkrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247018.3980293-313-164031064901790/AnsiballZ_file.py'
Feb 16 13:03:38 compute-1 sudo[80805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:38 compute-1 python3.9[80807]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:38 compute-1 sudo[80805]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:39 compute-1 sudo[80957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdsxkizywxpembayyzqizseugjvxkbmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247018.988305-343-255622330515465/AnsiballZ_stat.py'
Feb 16 13:03:39 compute-1 sudo[80957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:39 compute-1 python3.9[80959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:39 compute-1 sudo[80957]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:39 compute-1 sudo[81080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvodlxnmqonslgkzbczieejumjcynkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247018.988305-343-255622330515465/AnsiballZ_copy.py'
Feb 16 13:03:39 compute-1 sudo[81080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:40 compute-1 python3.9[81082]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247018.988305-343-255622330515465/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=2d511eca01c2212cf12340ad59765d3ca0de21a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:40 compute-1 sudo[81080]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:40 compute-1 sudo[81232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skgwebczgfqqgyscjdpcjpxbpyrucprg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247020.1870642-343-19066836387530/AnsiballZ_stat.py'
Feb 16 13:03:40 compute-1 sudo[81232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:40 compute-1 python3.9[81234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:40 compute-1 sudo[81232]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:40 compute-1 sudo[81355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obtjppoqpuoqdrqxnmskalkoexrepuxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247020.1870642-343-19066836387530/AnsiballZ_copy.py'
Feb 16 13:03:40 compute-1 sudo[81355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:41 compute-1 python3.9[81357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247020.1870642-343-19066836387530/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=35df7d65799c53f8c7036eda87c4670debdbe292 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:41 compute-1 sudo[81355]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:41 compute-1 sudo[81507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vslfnfgxctxidjqpmfnkcseqqdfmspmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247021.176805-343-112036285231988/AnsiballZ_stat.py'
Feb 16 13:03:41 compute-1 sudo[81507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:41 compute-1 python3.9[81509]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:41 compute-1 sudo[81507]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:41 compute-1 sudo[81630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrsprespgaksmsaqhjpbpxikimqjwzpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247021.176805-343-112036285231988/AnsiballZ_copy.py'
Feb 16 13:03:41 compute-1 sudo[81630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:42 compute-1 python3.9[81632]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247021.176805-343-112036285231988/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=a08051bd545151a6d9774535e08a3634cf46f7bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:42 compute-1 sudo[81630]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:42 compute-1 sudo[81782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsaqrzbhqtajyaesykrydgdanwpiwrjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247022.33688-428-185776700506375/AnsiballZ_file.py'
Feb 16 13:03:42 compute-1 sudo[81782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:42 compute-1 python3.9[81784]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:42 compute-1 sudo[81782]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:43 compute-1 sudo[81934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaziugjgajwlwghfxsckkuimzrzkpaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247022.919577-428-186214755562308/AnsiballZ_file.py'
Feb 16 13:03:43 compute-1 sudo[81934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:43 compute-1 python3.9[81936]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:43 compute-1 sudo[81934]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:43 compute-1 sudo[82086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugocdozigjcrnbwqrfngcfvobsqksyqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247023.5222192-457-212431508909364/AnsiballZ_stat.py'
Feb 16 13:03:43 compute-1 sudo[82086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:43 compute-1 python3.9[82088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:43 compute-1 sudo[82086]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:44 compute-1 sudo[82209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ferpiozvrazjvryvjbobagxzgertingl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247023.5222192-457-212431508909364/AnsiballZ_copy.py'
Feb 16 13:03:44 compute-1 sudo[82209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:44 compute-1 python3.9[82211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247023.5222192-457-212431508909364/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=81960df864aa0f56b70b5fd323cc205dd85f060a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:44 compute-1 sudo[82209]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:44 compute-1 sudo[82361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxmrglbxwcryxvrliogvfrnllsryghp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247024.592601-457-278984867784574/AnsiballZ_stat.py'
Feb 16 13:03:44 compute-1 sudo[82361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:44 compute-1 python3.9[82363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:44 compute-1 sudo[82361]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:45 compute-1 sudo[82484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmfnqzryspxfjgalrgjcjfhopkzskmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247024.592601-457-278984867784574/AnsiballZ_copy.py'
Feb 16 13:03:45 compute-1 sudo[82484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:45 compute-1 python3.9[82486]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247024.592601-457-278984867784574/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=35df7d65799c53f8c7036eda87c4670debdbe292 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:45 compute-1 sudo[82484]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:45 compute-1 sudo[82636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irsrcvblylmhvjofjtidupkbraejhdww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247025.5608022-457-108865242097967/AnsiballZ_stat.py'
Feb 16 13:03:45 compute-1 sudo[82636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:45 compute-1 python3.9[82638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:45 compute-1 sudo[82636]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:46 compute-1 sudo[82759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovappazmvstcitidkkzohnkqcyqtmokj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247025.5608022-457-108865242097967/AnsiballZ_copy.py'
Feb 16 13:03:46 compute-1 sudo[82759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:46 compute-1 python3.9[82761]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247025.5608022-457-108865242097967/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=fe38e22e50930197307412eb15fdca0aee13773d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:46 compute-1 sudo[82759]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:47 compute-1 sudo[82911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtbrfzyqtacjxxmlqupanbitrhlxzqww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247027.0870042-560-254494908693669/AnsiballZ_file.py'
Feb 16 13:03:47 compute-1 sudo[82911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:47 compute-1 python3.9[82913]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:47 compute-1 sudo[82911]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:47 compute-1 sudo[83063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbhsascqxgsnkxghnptguigdrbbvvtmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247027.6619143-575-1158673656944/AnsiballZ_stat.py'
Feb 16 13:03:47 compute-1 sudo[83063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:48 compute-1 python3.9[83065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:48 compute-1 sudo[83063]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:48 compute-1 sudo[83186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjtfqbftvxdxbzeegzaxbazavipnowpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247027.6619143-575-1158673656944/AnsiballZ_copy.py'
Feb 16 13:03:48 compute-1 sudo[83186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:48 compute-1 python3.9[83188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247027.6619143-575-1158673656944/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:48 compute-1 sudo[83186]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:48 compute-1 sudo[83338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trhtziiholdktvjiadmouywzeievauqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247028.7399669-615-162369319785312/AnsiballZ_file.py'
Feb 16 13:03:48 compute-1 sudo[83338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:49 compute-1 python3.9[83340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:49 compute-1 sudo[83338]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:49 compute-1 sudo[83490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmumefcrlvlihgumlwdynpjralgbeuxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247029.3848684-629-129465736573816/AnsiballZ_stat.py'
Feb 16 13:03:49 compute-1 sudo[83490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:49 compute-1 python3.9[83492]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:49 compute-1 sudo[83490]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:50 compute-1 sudo[83613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upgsaitxrhtbmvwaislnxirpixnnuzbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247029.3848684-629-129465736573816/AnsiballZ_copy.py'
Feb 16 13:03:50 compute-1 sudo[83613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:50 compute-1 python3.9[83615]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247029.3848684-629-129465736573816/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:50 compute-1 sudo[83613]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:50 compute-1 sudo[83765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwggsuwkpjcpnkvweejbooqwbsuzovqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247030.4853911-657-41513472943144/AnsiballZ_file.py'
Feb 16 13:03:50 compute-1 sudo[83765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:50 compute-1 python3.9[83767]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:50 compute-1 sudo[83765]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:51 compute-1 sudo[83917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txphskscmyesgugclnktvmhvtvbdbeud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247031.0001009-671-232952080061176/AnsiballZ_stat.py'
Feb 16 13:03:51 compute-1 sudo[83917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:51 compute-1 python3.9[83919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:51 compute-1 sudo[83917]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:51 compute-1 sudo[84040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpbtgowpzxvhjanxurwkgwemwwqudbwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247031.0001009-671-232952080061176/AnsiballZ_copy.py'
Feb 16 13:03:51 compute-1 sudo[84040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:51 compute-1 python3.9[84042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247031.0001009-671-232952080061176/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:51 compute-1 sudo[84040]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:52 compute-1 sudo[84192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvntzimrruzrbcuzuohhgaymbselwevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.1217353-703-153217171062113/AnsiballZ_file.py'
Feb 16 13:03:52 compute-1 sudo[84192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:52 compute-1 python3.9[84194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:52 compute-1 sudo[84192]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:53 compute-1 sudo[84344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdwrkvgyflpdfrqvuyjhusvoivnktkxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.7358768-720-16169279440452/AnsiballZ_stat.py'
Feb 16 13:03:53 compute-1 sudo[84344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:53 compute-1 python3.9[84346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:53 compute-1 sudo[84344]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:53 compute-1 sudo[84467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shiscreagdqslzdlweiiygarnpnmpxbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247032.7358768-720-16169279440452/AnsiballZ_copy.py'
Feb 16 13:03:53 compute-1 sudo[84467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:53 compute-1 python3.9[84469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247032.7358768-720-16169279440452/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:53 compute-1 sudo[84467]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:54 compute-1 sudo[84619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfznlyhwpiinfyzdfpdhjnefxmvhujop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.088529-752-223104732077808/AnsiballZ_file.py'
Feb 16 13:03:54 compute-1 sudo[84619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:54 compute-1 python3.9[84621]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:54 compute-1 sudo[84619]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:54 compute-1 sudo[84771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhrmisutdmhumnluufjrvbovkowriqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.6725962-768-18459348894585/AnsiballZ_stat.py'
Feb 16 13:03:54 compute-1 sudo[84771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:55 compute-1 python3.9[84773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:55 compute-1 sudo[84771]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:55 compute-1 sudo[84894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmmsrfmudrtnnqxjtzzavocaxloxchjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247034.6725962-768-18459348894585/AnsiballZ_copy.py'
Feb 16 13:03:55 compute-1 sudo[84894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:55 compute-1 python3.9[84896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247034.6725962-768-18459348894585/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:55 compute-1 sudo[84894]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:56 compute-1 sudo[85046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmrdrctadzunsygzplgpxndbzgjplfbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247035.7898061-798-221196038842804/AnsiballZ_file.py'
Feb 16 13:03:56 compute-1 sudo[85046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:56 compute-1 chronyd[65758]: Selected source 216.232.132.19 (pool.ntp.org)
Feb 16 13:03:56 compute-1 python3.9[85048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:56 compute-1 sudo[85046]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:56 compute-1 sudo[85198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yksmoqtuxgmkkikmsodmweotdnglqrxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247036.3913753-815-71240356181184/AnsiballZ_stat.py'
Feb 16 13:03:56 compute-1 sudo[85198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:56 compute-1 python3.9[85200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:56 compute-1 sudo[85198]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:57 compute-1 sudo[85321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turjraagbrnknrfgxuxkaxsfhpjuvmvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247036.3913753-815-71240356181184/AnsiballZ_copy.py'
Feb 16 13:03:57 compute-1 sudo[85321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:57 compute-1 python3.9[85323]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247036.3913753-815-71240356181184/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:57 compute-1 sudo[85321]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:57 compute-1 sudo[85473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbfjijswfxushtcbiwtqlvnlqqkpjbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247037.5486681-844-238626958612483/AnsiballZ_file.py'
Feb 16 13:03:57 compute-1 sudo[85473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:58 compute-1 python3.9[85475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:03:58 compute-1 sudo[85473]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:58 compute-1 sudo[85625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glhwxltxchrnrtqflyrgsdqmfeayfpir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247038.191461-859-151687410128040/AnsiballZ_stat.py'
Feb 16 13:03:58 compute-1 sudo[85625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:58 compute-1 python3.9[85627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:03:58 compute-1 sudo[85625]: pam_unix(sudo:session): session closed for user root
Feb 16 13:03:58 compute-1 sudo[85748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irdodchzxznibkreehhgmuwrzarnpzom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247038.191461-859-151687410128040/AnsiballZ_copy.py'
Feb 16 13:03:58 compute-1 sudo[85748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:03:59 compute-1 python3.9[85750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247038.191461-859-151687410128040/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=01d1f535123be2e7a115b69213a7cb6af06b70ab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:03:59 compute-1 sudo[85748]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:02 compute-1 sshd-session[78091]: Connection closed by 192.168.122.30 port 38836
Feb 16 13:04:02 compute-1 sshd-session[78088]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:04:02 compute-1 systemd-logind[821]: Session 20 logged out. Waiting for processes to exit.
Feb 16 13:04:02 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Feb 16 13:04:02 compute-1 systemd[1]: session-20.scope: Consumed 23.644s CPU time.
Feb 16 13:04:02 compute-1 systemd-logind[821]: Removed session 20.
Feb 16 13:04:02 compute-1 sshd-session[85775]: Connection closed by authenticating user root 146.190.226.24 port 42740 [preauth]
Feb 16 13:04:07 compute-1 sshd-session[85777]: Accepted publickey for zuul from 192.168.122.30 port 47194 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:04:07 compute-1 systemd-logind[821]: New session 21 of user zuul.
Feb 16 13:04:07 compute-1 systemd[1]: Started Session 21 of User zuul.
Feb 16 13:04:07 compute-1 sshd-session[85777]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:04:08 compute-1 python3.9[85930]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:09 compute-1 sudo[86084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfyuwywpyslzrngwdooivqroovwfubfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247049.4070168-49-80248152065618/AnsiballZ_file.py'
Feb 16 13:04:09 compute-1 sudo[86084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:10 compute-1 python3.9[86086]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:10 compute-1 sudo[86084]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:10 compute-1 sudo[86236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oautavzcjcusgzdlgmwzcuzoxscbuqmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247050.1698704-49-53728647426097/AnsiballZ_file.py'
Feb 16 13:04:10 compute-1 sudo[86236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:10 compute-1 python3.9[86238]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:10 compute-1 sudo[86236]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:11 compute-1 python3.9[86388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:11 compute-1 sudo[86538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adqftwfmwqlgxfrnwvnwcvuefbibknqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247051.4493692-95-222423988643000/AnsiballZ_seboolean.py'
Feb 16 13:04:11 compute-1 sudo[86538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:12 compute-1 python3.9[86540]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 13:04:12 compute-1 sudo[86538]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:13 compute-1 sudo[86694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytondcgkcftzsqaygjzsigyjzkpqqvgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247053.320211-115-232730752185661/AnsiballZ_setup.py'
Feb 16 13:04:13 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 16 13:04:13 compute-1 sudo[86694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:13 compute-1 python3.9[86696]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:04:14 compute-1 sudo[86694]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:14 compute-1 sudo[86778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndfaxnqwfmcrjblsfwgacyskahvvjtju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247053.320211-115-232730752185661/AnsiballZ_dnf.py'
Feb 16 13:04:14 compute-1 sudo[86778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:14 compute-1 python3.9[86780]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:04:16 compute-1 sudo[86778]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:16 compute-1 sudo[86931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scnuurkdpaszckrznicvqznlushogrke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247056.1978076-139-27016538458225/AnsiballZ_systemd.py'
Feb 16 13:04:16 compute-1 sudo[86931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:17 compute-1 python3.9[86933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:04:17 compute-1 sudo[86931]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:17 compute-1 sudo[87086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftwvgwxajkdfzajckvmedgbrsjbexnxn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247057.2121859-155-203407801778936/AnsiballZ_edpm_nftables_snippet.py'
Feb 16 13:04:17 compute-1 sudo[87086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:17 compute-1 python3[87088]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 16 13:04:17 compute-1 sudo[87086]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:18 compute-1 sudo[87238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whzwsdliarmaaeopfrskxqsjcrgttczo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.1026864-173-165497084715267/AnsiballZ_file.py'
Feb 16 13:04:18 compute-1 sudo[87238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:18 compute-1 python3.9[87240]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:18 compute-1 sudo[87238]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:19 compute-1 sudo[87390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmskdkvalozyucmxetntopvjfpdmdev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.7529345-189-243498195058078/AnsiballZ_stat.py'
Feb 16 13:04:19 compute-1 sudo[87390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:19 compute-1 python3.9[87392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:19 compute-1 sudo[87390]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:19 compute-1 sudo[87468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnyaqckkrbgwsqnfipihozyhcaufyuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247058.7529345-189-243498195058078/AnsiballZ_file.py'
Feb 16 13:04:19 compute-1 sudo[87468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:20 compute-1 python3.9[87470]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:20 compute-1 sudo[87468]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:20 compute-1 sudo[87620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfnzrxboudwlzbzdjqfxngfnerplbzdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247060.293012-213-225806663355566/AnsiballZ_stat.py'
Feb 16 13:04:20 compute-1 sudo[87620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:20 compute-1 python3.9[87622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:20 compute-1 sudo[87620]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:20 compute-1 sudo[87698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgwjroazoidetesxxbwnelmmvsenqpmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247060.293012-213-225806663355566/AnsiballZ_file.py'
Feb 16 13:04:20 compute-1 sudo[87698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:21 compute-1 python3.9[87700]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6lhc5n0y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:21 compute-1 sudo[87698]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:21 compute-1 sudo[87850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngqnbpaymiladeecizgpybymkwyinmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247061.3395362-237-51201657061761/AnsiballZ_stat.py'
Feb 16 13:04:21 compute-1 sudo[87850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:21 compute-1 python3.9[87852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:21 compute-1 sudo[87850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:22 compute-1 sudo[87928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfnrazgkrmddcmrniyyvztiiprlmtnvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247061.3395362-237-51201657061761/AnsiballZ_file.py'
Feb 16 13:04:22 compute-1 sudo[87928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:22 compute-1 python3.9[87930]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:22 compute-1 sudo[87928]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:22 compute-1 sudo[88080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxgqizhtozxqdebwhqblmhxsbnhexdgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247062.45217-264-94158794251212/AnsiballZ_command.py'
Feb 16 13:04:22 compute-1 sudo[88080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:23 compute-1 python3.9[88082]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:23 compute-1 sudo[88080]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:23 compute-1 sudo[88233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwbirpinhjcqltymmorswcmxjgkuxlmh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247063.2663116-279-94474805116258/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:04:23 compute-1 sudo[88233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:23 compute-1 python3[88235]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:04:23 compute-1 sudo[88233]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:24 compute-1 sudo[88385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvsggieglogykjhqwxhrlbkvviziynvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247064.0641208-295-43912691439185/AnsiballZ_stat.py'
Feb 16 13:04:24 compute-1 sudo[88385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:24 compute-1 python3.9[88387]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:24 compute-1 sudo[88385]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:24 compute-1 sudo[88510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbyftzwvjgdnvqrorakhgwupwbpvexto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247064.0641208-295-43912691439185/AnsiballZ_copy.py'
Feb 16 13:04:24 compute-1 sudo[88510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:25 compute-1 python3.9[88512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247064.0641208-295-43912691439185/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:25 compute-1 sudo[88510]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:25 compute-1 sudo[88662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmdncjkwpywdqtcksgkgpbdsdwhfgoqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247065.3358595-325-244367308422280/AnsiballZ_stat.py'
Feb 16 13:04:25 compute-1 sudo[88662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:25 compute-1 python3.9[88664]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:25 compute-1 sudo[88662]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:26 compute-1 sudo[88787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psjtzijkobwgxqmppjglnuyvnnercexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247065.3358595-325-244367308422280/AnsiballZ_copy.py'
Feb 16 13:04:26 compute-1 sudo[88787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:26 compute-1 python3.9[88789]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247065.3358595-325-244367308422280/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:26 compute-1 sudo[88787]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:26 compute-1 sudo[88939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeyudiabqkhkxlobrmlcqxrzchgnbjhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247066.4809663-355-14797047665569/AnsiballZ_stat.py'
Feb 16 13:04:26 compute-1 sudo[88939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:26 compute-1 python3.9[88941]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:26 compute-1 sudo[88939]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:27 compute-1 sudo[89064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqvxyzzfdkdjnjlpyyeiffssebcpzmug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247066.4809663-355-14797047665569/AnsiballZ_copy.py'
Feb 16 13:04:27 compute-1 sudo[89064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:27 compute-1 python3.9[89066]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247066.4809663-355-14797047665569/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:27 compute-1 sudo[89064]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:27 compute-1 sudo[89216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnodouigzdrhtfturawpstrsttoewfdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247067.656239-385-71079223055826/AnsiballZ_stat.py'
Feb 16 13:04:27 compute-1 sudo[89216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:28 compute-1 python3.9[89218]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:28 compute-1 sudo[89216]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:28 compute-1 sudo[89341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmdqhtioiglhpymakjvsfangqeybqpla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247067.656239-385-71079223055826/AnsiballZ_copy.py'
Feb 16 13:04:28 compute-1 sudo[89341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:28 compute-1 python3.9[89343]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247067.656239-385-71079223055826/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:28 compute-1 sudo[89341]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:29 compute-1 sudo[89493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkeqtianlazrtbpgvhhxtcldwhheepge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247068.8127425-415-255880233597735/AnsiballZ_stat.py'
Feb 16 13:04:29 compute-1 sudo[89493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:29 compute-1 python3.9[89495]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:29 compute-1 sudo[89493]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:29 compute-1 sudo[89618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etudpquennzvmxyshtfaqtcgllmfrupt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247068.8127425-415-255880233597735/AnsiballZ_copy.py'
Feb 16 13:04:29 compute-1 sudo[89618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:29 compute-1 python3.9[89620]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247068.8127425-415-255880233597735/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:29 compute-1 sudo[89618]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:30 compute-1 sudo[89770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjakavjidzgwbspadbbhgisqmefhegxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247070.1347637-445-108274671790915/AnsiballZ_file.py'
Feb 16 13:04:30 compute-1 sudo[89770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:30 compute-1 python3.9[89772]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:30 compute-1 sudo[89770]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:31 compute-1 sudo[89922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpomydrcgxkubyqxhsyfgqvxvlhqfote ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247070.7592266-461-80519541690316/AnsiballZ_command.py'
Feb 16 13:04:31 compute-1 sudo[89922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:31 compute-1 python3.9[89924]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:31 compute-1 sudo[89922]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:31 compute-1 sudo[90077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlkjjumeazgzchbgrkvffduvgqyjxdaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247071.4086342-477-206264435036859/AnsiballZ_blockinfile.py'
Feb 16 13:04:31 compute-1 sudo[90077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:31 compute-1 python3.9[90079]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:31 compute-1 sudo[90077]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:32 compute-1 sudo[90229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pewlgrldlphlacacindvzwiehduothtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247072.2194242-495-279447435103999/AnsiballZ_command.py'
Feb 16 13:04:32 compute-1 sudo[90229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:32 compute-1 python3.9[90231]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:32 compute-1 sudo[90229]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:33 compute-1 sudo[90382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtemafqbansbhlhujznyazciodfanxtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247072.913786-511-121503977713043/AnsiballZ_stat.py'
Feb 16 13:04:33 compute-1 sudo[90382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:33 compute-1 python3.9[90384]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:04:33 compute-1 sudo[90382]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:33 compute-1 sudo[90536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyndnwrojsbgietndokhdhtzrvxhnnkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247073.5899026-527-173713058047932/AnsiballZ_command.py'
Feb 16 13:04:33 compute-1 sudo[90536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:34 compute-1 python3.9[90538]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:34 compute-1 sudo[90536]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:34 compute-1 sudo[90691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kouomvmzwksukxjavoqsabhnjrigwbqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247074.1986516-543-133461036447911/AnsiballZ_file.py'
Feb 16 13:04:34 compute-1 sudo[90691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:34 compute-1 python3.9[90693]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:34 compute-1 sudo[90691]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:35 compute-1 python3.9[90843]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:04:36 compute-1 sudo[90994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjlskjoszahdcpxytohsvhhiprbcbmtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247076.4763367-623-128907369634950/AnsiballZ_command.py'
Feb 16 13:04:36 compute-1 sudo[90994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:36 compute-1 python3.9[90996]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:9f:1d:bd:e8" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:36 compute-1 ovs-vsctl[90997]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:9f:1d:bd:e8 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 16 13:04:36 compute-1 sudo[90994]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:37 compute-1 sudo[91147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcupzyysqwqevjtwbjwpyjsnmtkiovrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247077.1196046-641-76521429853974/AnsiballZ_command.py'
Feb 16 13:04:37 compute-1 sudo[91147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:37 compute-1 python3.9[91149]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:37 compute-1 sudo[91147]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:37 compute-1 sudo[91302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hragvmxkvufqemuxhvedsxejcxeldiih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247077.7634077-657-109983506921860/AnsiballZ_command.py'
Feb 16 13:04:37 compute-1 sudo[91302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:38 compute-1 python3.9[91304]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:04:38 compute-1 ovs-vsctl[91305]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 16 13:04:38 compute-1 sudo[91302]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:38 compute-1 python3.9[91455]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:04:39 compute-1 sudo[91607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egtgnedybhhhboozbxyavzihmadhxusb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.1383455-691-119543954014790/AnsiballZ_file.py'
Feb 16 13:04:39 compute-1 sudo[91607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:39 compute-1 python3.9[91609]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:39 compute-1 sudo[91607]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:40 compute-1 sudo[91759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oypemopawzqwvllcztplpkdifxriyimg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.9938748-707-143981439122137/AnsiballZ_stat.py'
Feb 16 13:04:40 compute-1 sudo[91759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:40 compute-1 python3.9[91761]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:40 compute-1 sudo[91759]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:40 compute-1 sudo[91837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aerzqsqgdgczuokxqxqyjaaixwpnbxvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247079.9938748-707-143981439122137/AnsiballZ_file.py'
Feb 16 13:04:40 compute-1 sudo[91837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:40 compute-1 python3.9[91839]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:40 compute-1 sudo[91837]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:41 compute-1 sudo[91989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lehovdmjccsenuzwxcjmpqmvfmwmxktm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247080.9484756-707-9112036038269/AnsiballZ_stat.py'
Feb 16 13:04:41 compute-1 sudo[91989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:41 compute-1 python3.9[91991]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:41 compute-1 sudo[91989]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:41 compute-1 sudo[92067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sciyouzupksffvwvzbpnqtsxgedbssab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247080.9484756-707-9112036038269/AnsiballZ_file.py'
Feb 16 13:04:41 compute-1 sudo[92067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:41 compute-1 python3.9[92069]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:41 compute-1 sudo[92067]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:42 compute-1 sudo[92219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzpbckgahvsoysiomyicwymzvscfqseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.1503162-753-139806539581484/AnsiballZ_file.py'
Feb 16 13:04:42 compute-1 sudo[92219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:42 compute-1 python3.9[92221]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:42 compute-1 sudo[92219]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:43 compute-1 sudo[92371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mejmacraxquidwwxsvqkhhvijiobzjwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.9979649-769-268772639764196/AnsiballZ_stat.py'
Feb 16 13:04:43 compute-1 sudo[92371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:43 compute-1 python3.9[92373]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:43 compute-1 sudo[92371]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:43 compute-1 sudo[92449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeypqxgdchybrhsbsyvfgvwuftxvgoyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247082.9979649-769-268772639764196/AnsiballZ_file.py'
Feb 16 13:04:43 compute-1 sudo[92449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:43 compute-1 python3.9[92451]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:43 compute-1 sudo[92449]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:44 compute-1 sudo[92601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkrgbfdogxamfguxyzpkmkusrqjeopaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247084.0847826-793-171483368047210/AnsiballZ_stat.py'
Feb 16 13:04:44 compute-1 sudo[92601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:44 compute-1 python3.9[92603]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:44 compute-1 sudo[92601]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:44 compute-1 sudo[92679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdjilozgyekvqosvkyshymvbtalhbvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247084.0847826-793-171483368047210/AnsiballZ_file.py'
Feb 16 13:04:44 compute-1 sudo[92679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:44 compute-1 python3.9[92681]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:45 compute-1 sudo[92679]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:45 compute-1 sudo[92831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqioeosahnjtpfddceepslgybeccpnzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247085.1572824-817-82879265054036/AnsiballZ_systemd.py'
Feb 16 13:04:45 compute-1 sudo[92831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:45 compute-1 python3.9[92833]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:04:45 compute-1 systemd[1]: Reloading.
Feb 16 13:04:45 compute-1 systemd-sysv-generator[92865]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:04:45 compute-1 systemd-rc-local-generator[92862]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:04:45 compute-1 sudo[92831]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:46 compute-1 sudo[93027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wredylcrwrqzumuvhkyjgybmzzamfvxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247086.420058-833-133188333283827/AnsiballZ_stat.py'
Feb 16 13:04:46 compute-1 sudo[93027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:46 compute-1 python3.9[93029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:46 compute-1 sudo[93027]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:47 compute-1 sudo[93105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmwraokjdpwskwezehrzpgguwbercbsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247086.420058-833-133188333283827/AnsiballZ_file.py'
Feb 16 13:04:47 compute-1 sudo[93105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:47 compute-1 python3.9[93107]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:47 compute-1 sudo[93105]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:47 compute-1 sudo[93257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dopskbxxfntecxvfmziokzweinzhngvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247087.527778-857-275899032528945/AnsiballZ_stat.py'
Feb 16 13:04:47 compute-1 sudo[93257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:48 compute-1 python3.9[93259]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:48 compute-1 sudo[93257]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:48 compute-1 sudo[93335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxwaqtilwrieozdupigmwfvgbvvxpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247087.527778-857-275899032528945/AnsiballZ_file.py'
Feb 16 13:04:48 compute-1 sudo[93335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:48 compute-1 python3.9[93337]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:48 compute-1 sudo[93335]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:48 compute-1 sudo[93487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbgwjnqdtxoxzizgfdfqovnzvrysjtlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247088.5735025-881-82191760402337/AnsiballZ_systemd.py'
Feb 16 13:04:48 compute-1 sudo[93487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:49 compute-1 python3.9[93489]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:04:49 compute-1 systemd[1]: Reloading.
Feb 16 13:04:49 compute-1 systemd-rc-local-generator[93518]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:04:49 compute-1 systemd-sysv-generator[93521]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:04:49 compute-1 systemd[1]: Starting Create netns directory...
Feb 16 13:04:49 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:04:49 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:04:49 compute-1 systemd[1]: Finished Create netns directory.
Feb 16 13:04:49 compute-1 sudo[93487]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:49 compute-1 sudo[93687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cncdhsuxbbkafvybrsdvjnhdebxrxeaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247089.6629558-901-75643017167017/AnsiballZ_file.py'
Feb 16 13:04:49 compute-1 sudo[93687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:50 compute-1 python3.9[93689]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:50 compute-1 sudo[93687]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:50 compute-1 sudo[93839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oggjugrwifdbfovvapgdljrqymdwzrod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247090.298442-917-14811922349594/AnsiballZ_stat.py'
Feb 16 13:04:50 compute-1 sudo[93839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:50 compute-1 python3.9[93841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:50 compute-1 sudo[93839]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:51 compute-1 sudo[93962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fczwihufgdwwfniltqufddgrqothrcry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247090.298442-917-14811922349594/AnsiballZ_copy.py'
Feb 16 13:04:51 compute-1 sudo[93962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:51 compute-1 python3.9[93964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247090.298442-917-14811922349594/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:51 compute-1 sudo[93962]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:52 compute-1 sudo[94114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwsnvybhsviynniagyajeinbcexlneuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247091.9303098-951-30690168372834/AnsiballZ_file.py'
Feb 16 13:04:52 compute-1 sudo[94114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:52 compute-1 python3.9[94116]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:52 compute-1 sudo[94114]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:52 compute-1 sudo[94266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufkwllvcwfryhngmdsblxillojwdojk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247092.5719347-967-40133208658288/AnsiballZ_file.py'
Feb 16 13:04:52 compute-1 sudo[94266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:52 compute-1 python3.9[94268]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:04:53 compute-1 sudo[94266]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:53 compute-1 sudo[94418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atulmykzviilpbfljboysfyywfcomiks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247093.2304647-983-69059428707328/AnsiballZ_stat.py'
Feb 16 13:04:53 compute-1 sudo[94418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:53 compute-1 python3.9[94420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:04:53 compute-1 sudo[94418]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:54 compute-1 sudo[94541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwaubbvktyukpojqgjjbkjwxdmtsjgnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247093.2304647-983-69059428707328/AnsiballZ_copy.py'
Feb 16 13:04:54 compute-1 sudo[94541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:54 compute-1 python3.9[94543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247093.2304647-983-69059428707328/.source.json _original_basename=.flwabymj follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:54 compute-1 sudo[94541]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:55 compute-1 python3.9[94693]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:04:57 compute-1 sudo[95114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnpltctixrpnqioeayisvwruxggzpvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247096.662753-1064-214855980654835/AnsiballZ_container_config_data.py'
Feb 16 13:04:57 compute-1 sudo[95114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:57 compute-1 python3.9[95116]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 16 13:04:57 compute-1 sudo[95114]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:58 compute-1 sudo[95266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgkhpzfkdxytziufthfxbyoeszbgivld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247097.6172628-1085-2684782854254/AnsiballZ_container_config_hash.py'
Feb 16 13:04:58 compute-1 sudo[95266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:58 compute-1 python3.9[95268]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:04:58 compute-1 sudo[95266]: pam_unix(sudo:session): session closed for user root
Feb 16 13:04:59 compute-1 sudo[95418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgihweifaklxullkbbvofhyuvavhmglj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247098.5509508-1105-58310905499028/AnsiballZ_edpm_container_manage.py'
Feb 16 13:04:59 compute-1 sudo[95418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:04:59 compute-1 python3[95420]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:04:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:04:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:04:59 compute-1 podman[95456]: 2026-02-16 13:04:59.710416151 +0000 UTC m=+0.043823907 container create 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 13:04:59 compute-1 podman[95456]: 2026-02-16 13:04:59.688672078 +0000 UTC m=+0.022079864 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 13:04:59 compute-1 python3[95420]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 13:04:59 compute-1 sudo[95418]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:00 compute-1 sudo[95645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfpwlbxvgcvgxtarnaowrusggvqznpud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247100.0171046-1121-202278089475092/AnsiballZ_stat.py'
Feb 16 13:05:00 compute-1 sudo[95645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:00 compute-1 python3.9[95647]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:00 compute-1 sudo[95645]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:00 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 13:05:00 compute-1 sudo[95799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itrkhqkgunxaqqugqxeaxqhcowwnrgbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247100.7190855-1139-183895985924203/AnsiballZ_file.py'
Feb 16 13:05:00 compute-1 sudo[95799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:01 compute-1 python3.9[95801]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:01 compute-1 sudo[95799]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:01 compute-1 sudo[95875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnjvkcvulmbhszbknfnnyduzuhxwyixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247100.7190855-1139-183895985924203/AnsiballZ_stat.py'
Feb 16 13:05:01 compute-1 sudo[95875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:01 compute-1 python3.9[95877]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:01 compute-1 sudo[95875]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:02 compute-1 sudo[96026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzvbexuhfrrsyjuzmoilwylqoiirvlho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.6194565-1139-77044363752454/AnsiballZ_copy.py'
Feb 16 13:05:02 compute-1 sudo[96026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:02 compute-1 python3.9[96028]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247101.6194565-1139-77044363752454/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:02 compute-1 sudo[96026]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:02 compute-1 sudo[96102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmsxlfsadddkdxpkktcuvrpnbbcmcily ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.6194565-1139-77044363752454/AnsiballZ_systemd.py'
Feb 16 13:05:02 compute-1 sudo[96102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:02 compute-1 python3.9[96104]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:05:02 compute-1 systemd[1]: Reloading.
Feb 16 13:05:02 compute-1 systemd-rc-local-generator[96124]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:02 compute-1 systemd-sysv-generator[96131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:02 compute-1 sudo[96102]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:03 compute-1 sudo[96220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sydhmcwcvroibrrszakxsoguztqthbfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247101.6194565-1139-77044363752454/AnsiballZ_systemd.py'
Feb 16 13:05:03 compute-1 sudo[96220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:03 compute-1 python3.9[96222]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:03 compute-1 systemd[1]: Reloading.
Feb 16 13:05:03 compute-1 systemd-sysv-generator[96252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:03 compute-1 systemd-rc-local-generator[96249]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:03 compute-1 systemd[1]: Starting ovn_controller container...
Feb 16 13:05:03 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 16 13:05:03 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:05:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02e42e3ba4c174bf066101cd18d60d17cff10b421e5f3643246fb4d5b5f504f7/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 13:05:03 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1.
Feb 16 13:05:03 compute-1 podman[96270]: 2026-02-16 13:05:03.840739567 +0000 UTC m=+0.134102439 container init 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:05:03 compute-1 ovn_controller[96285]: + sudo -E kolla_set_configs
Feb 16 13:05:03 compute-1 podman[96270]: 2026-02-16 13:05:03.867522628 +0000 UTC m=+0.160885500 container start 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:05:03 compute-1 edpm-start-podman-container[96270]: ovn_controller
Feb 16 13:05:03 compute-1 systemd[1]: Created slice User Slice of UID 0.
Feb 16 13:05:03 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 16 13:05:03 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 16 13:05:03 compute-1 systemd[1]: Starting User Manager for UID 0...
Feb 16 13:05:03 compute-1 systemd[96321]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 16 13:05:03 compute-1 edpm-start-podman-container[96269]: Creating additional drop-in dependency for "ovn_controller" (6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1)
Feb 16 13:05:03 compute-1 podman[96291]: 2026-02-16 13:05:03.948945179 +0000 UTC m=+0.070455260 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 16 13:05:03 compute-1 systemd[1]: 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1-6835c552cba169bd.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 13:05:03 compute-1 systemd[1]: 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1-6835c552cba169bd.service: Failed with result 'exit-code'.
Feb 16 13:05:03 compute-1 systemd[1]: Reloading.
Feb 16 13:05:04 compute-1 systemd-sysv-generator[96372]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:04 compute-1 systemd-rc-local-generator[96367]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:04 compute-1 systemd[96321]: Queued start job for default target Main User Target.
Feb 16 13:05:04 compute-1 systemd[96321]: Created slice User Application Slice.
Feb 16 13:05:04 compute-1 systemd[96321]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 16 13:05:04 compute-1 systemd[96321]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:05:04 compute-1 systemd[96321]: Reached target Paths.
Feb 16 13:05:04 compute-1 systemd[96321]: Reached target Timers.
Feb 16 13:05:04 compute-1 systemd[96321]: Starting D-Bus User Message Bus Socket...
Feb 16 13:05:04 compute-1 systemd[96321]: Starting Create User's Volatile Files and Directories...
Feb 16 13:05:04 compute-1 systemd[96321]: Finished Create User's Volatile Files and Directories.
Feb 16 13:05:04 compute-1 systemd[96321]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:05:04 compute-1 systemd[96321]: Reached target Sockets.
Feb 16 13:05:04 compute-1 systemd[96321]: Reached target Basic System.
Feb 16 13:05:04 compute-1 systemd[96321]: Reached target Main User Target.
Feb 16 13:05:04 compute-1 systemd[96321]: Startup finished in 136ms.
Feb 16 13:05:04 compute-1 systemd[1]: Started User Manager for UID 0.
Feb 16 13:05:04 compute-1 systemd[1]: Started ovn_controller container.
Feb 16 13:05:04 compute-1 systemd[1]: Started Session c1 of User root.
Feb 16 13:05:04 compute-1 sudo[96220]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:04 compute-1 ovn_controller[96285]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:05:04 compute-1 ovn_controller[96285]: INFO:__main__:Validating config file
Feb 16 13:05:04 compute-1 ovn_controller[96285]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:05:04 compute-1 ovn_controller[96285]: INFO:__main__:Writing out command to execute
Feb 16 13:05:04 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: ++ cat /run_command
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + ARGS=
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + sudo kolla_copy_cacerts
Feb 16 13:05:04 compute-1 systemd[1]: Started Session c2 of User root.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + [[ ! -n '' ]]
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + . kolla_extend_start
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 16 13:05:04 compute-1 ovn_controller[96285]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + umask 0022
Feb 16 13:05:04 compute-1 ovn_controller[96285]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 16 13:05:04 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3360] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3366] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <warn>  [1771247104.3369] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3376] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3382] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3386] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 16 13:05:04 compute-1 kernel: br-int: entered promiscuous mode
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00022|main|INFO|OVS feature set changed, force recompute.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-1 ovn_controller[96285]: 2026-02-16T13:05:04Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3628] manager: (ovn-b0e583-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3636] manager: (ovn-16940e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Feb 16 13:05:04 compute-1 systemd-udevd[96428]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:05:04 compute-1 systemd-udevd[96426]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:05:04 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3765] device (genev_sys_6081): carrier: link connected
Feb 16 13:05:04 compute-1 NetworkManager[56388]: <info>  [1771247104.3769] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 16 13:05:05 compute-1 python3.9[96557]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:05:05 compute-1 sudo[96707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fstaxcxghibbhgorbskjiyzhvpjbicyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247105.5572312-1229-56242855449507/AnsiballZ_stat.py'
Feb 16 13:05:05 compute-1 sudo[96707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:06 compute-1 python3.9[96709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:06 compute-1 sudo[96707]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:06 compute-1 sudo[96830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weqsbsssuetrmknbuwzddgthjejcihzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247105.5572312-1229-56242855449507/AnsiballZ_copy.py'
Feb 16 13:05:06 compute-1 sudo[96830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:06 compute-1 python3.9[96832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247105.5572312-1229-56242855449507/.source.yaml _original_basename=.eidq3cpi follow=False checksum=1ce770b9a19b4b0066c27b1fbba4d3923dbce27b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:06 compute-1 sudo[96830]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:06 compute-1 sudo[96982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqhgthbqsqwtcnnbgvxxvrdrdhwwtyfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247106.6829162-1259-202954529018811/AnsiballZ_command.py'
Feb 16 13:05:06 compute-1 sudo[96982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:07 compute-1 python3.9[96984]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:07 compute-1 ovs-vsctl[96987]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 16 13:05:07 compute-1 sudo[96982]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:07 compute-1 sshd-session[96985]: Connection closed by authenticating user root 146.190.226.24 port 43732 [preauth]
Feb 16 13:05:07 compute-1 sudo[97137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqmrltjwyuutigggtiygycdonmglejty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247107.271376-1275-80974283208913/AnsiballZ_command.py'
Feb 16 13:05:07 compute-1 sudo[97137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:07 compute-1 python3.9[97139]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:07 compute-1 ovs-vsctl[97141]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 16 13:05:07 compute-1 sudo[97137]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:08 compute-1 sudo[97292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szczkqsnmrrixahutmzwpsjmfawoxfhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247108.2154484-1303-222637829906833/AnsiballZ_command.py'
Feb 16 13:05:08 compute-1 sudo[97292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:08 compute-1 python3.9[97294]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:05:08 compute-1 ovs-vsctl[97295]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 16 13:05:08 compute-1 sudo[97292]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:09 compute-1 sshd-session[85780]: Connection closed by 192.168.122.30 port 47194
Feb 16 13:05:09 compute-1 sshd-session[85777]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:05:09 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Feb 16 13:05:09 compute-1 systemd[1]: session-21.scope: Consumed 40.952s CPU time.
Feb 16 13:05:09 compute-1 systemd-logind[821]: Session 21 logged out. Waiting for processes to exit.
Feb 16 13:05:09 compute-1 systemd-logind[821]: Removed session 21.
Feb 16 13:05:14 compute-1 systemd[1]: Stopping User Manager for UID 0...
Feb 16 13:05:14 compute-1 systemd[96321]: Activating special unit Exit the Session...
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped target Main User Target.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped target Basic System.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped target Paths.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped target Sockets.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped target Timers.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:05:14 compute-1 systemd[96321]: Closed D-Bus User Message Bus Socket.
Feb 16 13:05:14 compute-1 systemd[96321]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:05:14 compute-1 systemd[96321]: Removed slice User Application Slice.
Feb 16 13:05:14 compute-1 systemd[96321]: Reached target Shutdown.
Feb 16 13:05:14 compute-1 systemd[96321]: Finished Exit the Session.
Feb 16 13:05:14 compute-1 systemd[96321]: Reached target Exit the Session.
Feb 16 13:05:14 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Feb 16 13:05:14 compute-1 systemd[1]: Stopped User Manager for UID 0.
Feb 16 13:05:14 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 16 13:05:14 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 16 13:05:14 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 16 13:05:14 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 16 13:05:14 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Feb 16 13:05:15 compute-1 sshd-session[97322]: Accepted publickey for zuul from 192.168.122.30 port 32828 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:05:15 compute-1 systemd-logind[821]: New session 23 of user zuul.
Feb 16 13:05:15 compute-1 systemd[1]: Started Session 23 of User zuul.
Feb 16 13:05:15 compute-1 sshd-session[97322]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:05:16 compute-1 python3.9[97475]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:05:17 compute-1 sudo[97629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eanlkladkewahitkueqdyzegtwaoynkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247116.8999245-49-73265781351158/AnsiballZ_file.py'
Feb 16 13:05:17 compute-1 sudo[97629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:17 compute-1 python3.9[97631]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:17 compute-1 sudo[97629]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:18 compute-1 sudo[97781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blrkslvocpltncsggsswrbijaqlkbpbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247117.752214-49-281398031485304/AnsiballZ_file.py'
Feb 16 13:05:18 compute-1 sudo[97781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:18 compute-1 python3.9[97783]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:18 compute-1 sudo[97781]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:18 compute-1 sudo[97933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pahxmetustgmebzjmfdqwpcaodohelch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247118.4467819-49-240806676004567/AnsiballZ_file.py'
Feb 16 13:05:18 compute-1 sudo[97933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:18 compute-1 python3.9[97935]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:18 compute-1 sudo[97933]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:19 compute-1 sudo[98085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlwrfiztrfrhzwaoevwsooctgfiyikpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247119.1118786-49-259772920146301/AnsiballZ_file.py'
Feb 16 13:05:19 compute-1 sudo[98085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:19 compute-1 python3.9[98087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:19 compute-1 sudo[98085]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:20 compute-1 sudo[98237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roxigzcmbllbunhdencjvduljmeuoolb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247119.8404837-49-138078569893920/AnsiballZ_file.py'
Feb 16 13:05:20 compute-1 sudo[98237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:20 compute-1 python3.9[98239]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:20 compute-1 sudo[98237]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:21 compute-1 python3.9[98389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:05:22 compute-1 sudo[98539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzdwagdttolkbrrkzgfserpzucvfcmve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247121.610187-137-232536376198886/AnsiballZ_seboolean.py'
Feb 16 13:05:22 compute-1 sudo[98539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:22 compute-1 python3.9[98541]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 13:05:22 compute-1 sudo[98539]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:23 compute-1 python3.9[98692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:24 compute-1 python3.9[98813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247123.0169973-153-229944084107677/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:25 compute-1 python3.9[98963]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:25 compute-1 python3.9[99084]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247124.508541-183-113846431571651/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:26 compute-1 sudo[99234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvpmdgnwyorbebfgkuquoaofzedznkbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247126.1099648-217-272663123090547/AnsiballZ_setup.py'
Feb 16 13:05:26 compute-1 sudo[99234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:26 compute-1 python3.9[99236]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:05:26 compute-1 sudo[99234]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:27 compute-1 sudo[99318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qecufjesgkdgmaxwbmqpygkqlkoiunkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247126.1099648-217-272663123090547/AnsiballZ_dnf.py'
Feb 16 13:05:27 compute-1 sudo[99318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:27 compute-1 python3.9[99320]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:05:28 compute-1 sudo[99318]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:29 compute-1 sudo[99471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znwnopwrgmjgkpfhcvabadtmglukkrox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247129.1159916-241-66179025233056/AnsiballZ_systemd.py'
Feb 16 13:05:29 compute-1 sudo[99471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:30 compute-1 python3.9[99473]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:05:30 compute-1 sudo[99471]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:30 compute-1 python3.9[99626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:31 compute-1 python3.9[99747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247130.2926495-257-125008377845082/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:31 compute-1 python3.9[99897]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:32 compute-1 python3.9[100018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247131.301002-257-35269260067774/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:33 compute-1 python3.9[100168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:34 compute-1 python3.9[100289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247133.224092-345-268146114091214/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:34 compute-1 ovn_controller[96285]: 2026-02-16T13:05:34Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Feb 16 13:05:34 compute-1 ovn_controller[96285]: 2026-02-16T13:05:34Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Feb 16 13:05:34 compute-1 podman[100290]: 2026-02-16 13:05:34.305572698 +0000 UTC m=+0.130095228 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 16 13:05:34 compute-1 python3.9[100465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:35 compute-1 python3.9[100586]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247134.3307493-345-14311557895674/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:35 compute-1 python3.9[100736]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:36 compute-1 sudo[100888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxddmwpfejrlpueqlmbiupgknllfmgbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.1195886-421-204773397338246/AnsiballZ_file.py'
Feb 16 13:05:36 compute-1 sudo[100888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:36 compute-1 python3.9[100890]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:36 compute-1 sudo[100888]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:37 compute-1 sudo[101040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvsdpesafqgqqucpmxnaebbgwmdktwwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.8201253-437-58530055198540/AnsiballZ_stat.py'
Feb 16 13:05:37 compute-1 sudo[101040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:37 compute-1 python3.9[101042]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:37 compute-1 sudo[101040]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:37 compute-1 sudo[101118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cumbtayfishoogvhsvuwavbtdglqfctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247136.8201253-437-58530055198540/AnsiballZ_file.py'
Feb 16 13:05:37 compute-1 sudo[101118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:37 compute-1 python3.9[101120]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:37 compute-1 sudo[101118]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:38 compute-1 sudo[101270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okdezeylikhntkwhywnxcshrjircrpel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247137.8013432-437-48859245412581/AnsiballZ_stat.py'
Feb 16 13:05:38 compute-1 sudo[101270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:38 compute-1 python3.9[101272]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:38 compute-1 sudo[101270]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:38 compute-1 sudo[101348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydzxsbxnnuitlayezzzdmpfbadokzwzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247137.8013432-437-48859245412581/AnsiballZ_file.py'
Feb 16 13:05:38 compute-1 sudo[101348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:38 compute-1 python3.9[101350]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:38 compute-1 sudo[101348]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:39 compute-1 sudo[101500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaaajhhbftopamxcqpstbuztrzlsygsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247138.988988-483-48698551800714/AnsiballZ_file.py'
Feb 16 13:05:39 compute-1 sudo[101500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:39 compute-1 python3.9[101502]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:39 compute-1 sudo[101500]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:39 compute-1 sudo[101652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efkncmlmvmgujvaldtbzlxvqxeufpesa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247139.6726766-499-81858201772667/AnsiballZ_stat.py'
Feb 16 13:05:39 compute-1 sudo[101652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:40 compute-1 python3.9[101654]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:40 compute-1 sudo[101652]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:40 compute-1 sudo[101730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuxwxszzjwvtnhdlcidsixoxdzvgaao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247139.6726766-499-81858201772667/AnsiballZ_file.py'
Feb 16 13:05:40 compute-1 sudo[101730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:40 compute-1 python3.9[101732]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:40 compute-1 sudo[101730]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:41 compute-1 sudo[101882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlmjuwmhdrjqjowtcdkjwngwmnefmuau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247140.8087888-523-204719207022858/AnsiballZ_stat.py'
Feb 16 13:05:41 compute-1 sudo[101882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:41 compute-1 python3.9[101884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:41 compute-1 sudo[101882]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:41 compute-1 sudo[101960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdvcbvjssbpmegmjclwitzdcryrakmuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247140.8087888-523-204719207022858/AnsiballZ_file.py'
Feb 16 13:05:41 compute-1 sudo[101960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:41 compute-1 python3.9[101962]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:41 compute-1 sudo[101960]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:42 compute-1 sudo[102112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sftxlacnndhbcjfkedikqmwxabpdwtbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247141.86511-547-215734996029028/AnsiballZ_systemd.py'
Feb 16 13:05:42 compute-1 sudo[102112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:42 compute-1 python3.9[102114]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:42 compute-1 systemd[1]: Reloading.
Feb 16 13:05:42 compute-1 systemd-rc-local-generator[102136]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:42 compute-1 systemd-sysv-generator[102143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:42 compute-1 sudo[102112]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:43 compute-1 sudo[102307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzyioekaluitbfxrhrqwzamkpjksqkti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247142.9611177-563-94659969181628/AnsiballZ_stat.py'
Feb 16 13:05:43 compute-1 sudo[102307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:43 compute-1 python3.9[102309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:43 compute-1 sudo[102307]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:43 compute-1 sudo[102385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfqthvodvnozwflthkialeyhwxmwvwbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247142.9611177-563-94659969181628/AnsiballZ_file.py'
Feb 16 13:05:43 compute-1 sudo[102385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:43 compute-1 python3.9[102387]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:43 compute-1 sudo[102385]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:44 compute-1 sudo[102537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwhmbsaqbrfgarvgyekaghpaivccyrrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247144.1417933-587-209363533600422/AnsiballZ_stat.py'
Feb 16 13:05:44 compute-1 sudo[102537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:44 compute-1 python3.9[102539]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:44 compute-1 sudo[102537]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:44 compute-1 sudo[102615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqjvuxrynnyyexjdqtgsavpesocriwlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247144.1417933-587-209363533600422/AnsiballZ_file.py'
Feb 16 13:05:44 compute-1 sudo[102615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:45 compute-1 python3.9[102617]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:45 compute-1 sudo[102615]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:45 compute-1 sudo[102767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swqgkhcvljsefblweqnrdxxpxzofxcbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247145.2461483-611-47083980847670/AnsiballZ_systemd.py'
Feb 16 13:05:45 compute-1 sudo[102767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:45 compute-1 python3.9[102769]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:05:45 compute-1 systemd[1]: Reloading.
Feb 16 13:05:45 compute-1 systemd-sysv-generator[102801]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:05:45 compute-1 systemd-rc-local-generator[102797]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:05:46 compute-1 systemd[1]: Starting Create netns directory...
Feb 16 13:05:46 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 13:05:46 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 13:05:46 compute-1 systemd[1]: Finished Create netns directory.
Feb 16 13:05:46 compute-1 sudo[102767]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:46 compute-1 sudo[102969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skfzgtpavloybbmgpfajiqajdmwvhwvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247146.3880708-631-156935103574063/AnsiballZ_file.py'
Feb 16 13:05:46 compute-1 sudo[102969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:46 compute-1 python3.9[102971]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:46 compute-1 sudo[102969]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:47 compute-1 sudo[103121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grhmzeyzzjfbfykkhbyxjfofvloekkap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247147.0672932-647-54763439861660/AnsiballZ_stat.py'
Feb 16 13:05:47 compute-1 sudo[103121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:47 compute-1 python3.9[103123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:47 compute-1 sudo[103121]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:47 compute-1 sudo[103244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiccbgnfxrreyrlfzoldazvheodluqyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247147.0672932-647-54763439861660/AnsiballZ_copy.py'
Feb 16 13:05:47 compute-1 sudo[103244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:48 compute-1 python3.9[103246]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247147.0672932-647-54763439861660/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:48 compute-1 sudo[103244]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:48 compute-1 sudo[103396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjgkjhlrdvvnijymrwfixankkwbdfdjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247148.6266346-681-280023223675784/AnsiballZ_file.py'
Feb 16 13:05:48 compute-1 sudo[103396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:49 compute-1 python3.9[103398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:49 compute-1 sudo[103396]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:49 compute-1 sudo[103550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibzghnwhpgzhlwknkzynlpkjefnygfnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.3488133-697-194551133442557/AnsiballZ_file.py'
Feb 16 13:05:49 compute-1 sudo[103550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:49 compute-1 python3.9[103552]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:05:49 compute-1 sudo[103550]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:50 compute-1 sudo[103702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjseammhrtirvpjqbqijzncjqrcdmlca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.9958134-713-231831597913639/AnsiballZ_stat.py'
Feb 16 13:05:50 compute-1 sudo[103702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:50 compute-1 sshd-session[103423]: Invalid user solana from 2.57.122.210 port 51004
Feb 16 13:05:50 compute-1 sshd-session[103423]: Connection closed by invalid user solana 2.57.122.210 port 51004 [preauth]
Feb 16 13:05:50 compute-1 python3.9[103704]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:05:50 compute-1 sudo[103702]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:50 compute-1 sudo[103825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nchupfliyvnjriprjouedujzpjncdmuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247149.9958134-713-231831597913639/AnsiballZ_copy.py'
Feb 16 13:05:50 compute-1 sudo[103825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:50 compute-1 python3.9[103827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247149.9958134-713-231831597913639/.source.json _original_basename=.y7kz_g8l follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:50 compute-1 sudo[103825]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:51 compute-1 python3.9[103977]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:53 compute-1 sudo[104398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkqnbyttxucteegrmbknjpyingagezhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247153.5435393-793-125979223928911/AnsiballZ_container_config_data.py'
Feb 16 13:05:53 compute-1 sudo[104398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:54 compute-1 python3.9[104400]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 16 13:05:54 compute-1 sudo[104398]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:55 compute-1 sudo[104550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkgdvdokylrtnwqpelzjymqzscwsurxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247154.631761-815-68899996825969/AnsiballZ_container_config_hash.py'
Feb 16 13:05:55 compute-1 sudo[104550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:55 compute-1 python3.9[104552]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:05:55 compute-1 sudo[104550]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:56 compute-1 sudo[104702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bucdwxacalfclbsajyskplbujyhbdays ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247155.6539328-835-227075535561429/AnsiballZ_edpm_container_manage.py'
Feb 16 13:05:56 compute-1 sudo[104702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:56 compute-1 python3[104704]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:05:56 compute-1 podman[104739]: 2026-02-16 13:05:56.490463778 +0000 UTC m=+0.051509867 container create 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 16 13:05:56 compute-1 podman[104739]: 2026-02-16 13:05:56.462458886 +0000 UTC m=+0.023504995 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:05:56 compute-1 python3[104704]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:05:56 compute-1 sudo[104702]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:57 compute-1 sudo[104928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsjkghyyytrziseracumhvemsrazxyey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247156.8867276-851-185607298163538/AnsiballZ_stat.py'
Feb 16 13:05:57 compute-1 sudo[104928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:57 compute-1 python3.9[104930]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:57 compute-1 sudo[104928]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:58 compute-1 sudo[105082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdchbuqyshorwdzwxkqorfnzdnmrrnqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247157.9229305-869-152672668168356/AnsiballZ_file.py'
Feb 16 13:05:58 compute-1 sudo[105082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:58 compute-1 python3.9[105084]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:58 compute-1 sudo[105082]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:58 compute-1 sudo[105158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymiijmrubktiqwpkqxfnkufqdmrhyctc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247157.9229305-869-152672668168356/AnsiballZ_stat.py'
Feb 16 13:05:58 compute-1 sudo[105158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:58 compute-1 python3.9[105160]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:05:58 compute-1 sudo[105158]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:59 compute-1 sudo[105309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfkhqgercombcrucqpzmfwtevrlwnuhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.8639886-869-66824228110304/AnsiballZ_copy.py'
Feb 16 13:05:59 compute-1 sudo[105309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:05:59 compute-1 python3.9[105311]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247158.8639886-869-66824228110304/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:05:59 compute-1 sudo[105309]: pam_unix(sudo:session): session closed for user root
Feb 16 13:05:59 compute-1 sudo[105385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izkvrycviihgkrtdakkktbwvnjzhsbqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.8639886-869-66824228110304/AnsiballZ_systemd.py'
Feb 16 13:05:59 compute-1 sudo[105385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:00 compute-1 python3.9[105387]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:00 compute-1 systemd[1]: Reloading.
Feb 16 13:06:00 compute-1 systemd-rc-local-generator[105411]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:00 compute-1 systemd-sysv-generator[105415]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:00 compute-1 sudo[105385]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:00 compute-1 sudo[105502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwsrjeaqlndvzqpllvlehpwikbnjbsqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247158.8639886-869-66824228110304/AnsiballZ_systemd.py'
Feb 16 13:06:00 compute-1 sudo[105502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:00 compute-1 python3.9[105504]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:00 compute-1 systemd[1]: Reloading.
Feb 16 13:06:00 compute-1 systemd-rc-local-generator[105535]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:00 compute-1 systemd-sysv-generator[105543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:01 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Feb 16 13:06:01 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:06:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fb20bd18d0e6e29d196efe75ba3f06e210f086e4fdc2966794f4f16589db6fc/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 16 13:06:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fb20bd18d0e6e29d196efe75ba3f06e210f086e4fdc2966794f4f16589db6fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:06:01 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879.
Feb 16 13:06:01 compute-1 podman[105552]: 2026-02-16 13:06:01.225529818 +0000 UTC m=+0.114385983 container init 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + sudo -E kolla_set_configs
Feb 16 13:06:01 compute-1 podman[105552]: 2026-02-16 13:06:01.25068685 +0000 UTC m=+0.139543005 container start 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:06:01 compute-1 edpm-start-podman-container[105552]: ovn_metadata_agent
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Validating config file
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Copying service configuration files
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Writing out command to execute
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: ++ cat /run_command
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + CMD=neutron-ovn-metadata-agent
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + ARGS=
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + sudo kolla_copy_cacerts
Feb 16 13:06:01 compute-1 edpm-start-podman-container[105551]: Creating additional drop-in dependency for "ovn_metadata_agent" (6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879)
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + [[ ! -n '' ]]
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + . kolla_extend_start
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: Running command: 'neutron-ovn-metadata-agent'
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + umask 0022
Feb 16 13:06:01 compute-1 ovn_metadata_agent[105568]: + exec neutron-ovn-metadata-agent
Feb 16 13:06:01 compute-1 systemd[1]: Reloading.
Feb 16 13:06:01 compute-1 podman[105575]: 2026-02-16 13:06:01.333233792 +0000 UTC m=+0.071612241 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:06:01 compute-1 systemd-rc-local-generator[105652]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:01 compute-1 systemd-sysv-generator[105657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:01 compute-1 systemd[1]: Started ovn_metadata_agent container.
Feb 16 13:06:01 compute-1 sudo[105502]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:02 compute-1 python3.9[105811]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.257 105573 INFO neutron.common.config [-] Logging enabled!
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.257 105573 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.257 105573 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.258 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.258 105573 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.258 105573 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.258 105573 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.259 105573 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.260 105573 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.261 105573 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.262 105573 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.263 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.264 105573 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.265 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.266 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.267 105573 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.268 105573 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.269 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.270 105573 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.271 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.272 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.273 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.274 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.275 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.276 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.277 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.278 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.279 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.280 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.281 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.282 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.283 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.284 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.285 105573 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.286 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.287 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.288 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.289 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.290 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.291 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.292 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.293 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.293 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.293 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.293 105573 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.293 105573 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.304 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.304 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.304 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.305 105573 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.305 105573 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.319 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 54c1a259-778a-4222-b2c6-8422ea19a065 (UUID: 54c1a259-778a-4222-b2c6-8422ea19a065) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.345 105573 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.346 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.346 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.346 105573 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.349 105573 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.355 105573 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.365 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '54c1a259-778a-4222-b2c6-8422ea19a065'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], external_ids={}, name=54c1a259-778a-4222-b2c6-8422ea19a065, nb_cfg_timestamp=1771247112354, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.366 105573 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f4c895d6a30>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.367 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.368 105573 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.368 105573 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.368 105573 INFO oslo_service.service [-] Starting 1 workers
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.372 105573 DEBUG oslo_service.service [-] Started child 105911 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.375 105911 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-892811'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.375 105573 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpvjdzkvhk/privsep.sock']
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.402 105911 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.402 105911 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.403 105911 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.406 105911 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.412 105911 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 13:06:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.418 105911 INFO eventlet.wsgi.server [-] (105911) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 16 13:06:03 compute-1 sudo[105965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nneihoxeyjeayqwzjkjwqjxtbqwubpue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247163.2276754-959-118909171356574/AnsiballZ_stat.py'
Feb 16 13:06:03 compute-1 sudo[105965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:03 compute-1 python3.9[105967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:06:03 compute-1 sudo[105965]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:03 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 16 13:06:03 compute-1 sudo[106092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aehmqgrfbkmkokhkvekokclbqeovkczh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247163.2276754-959-118909171356574/AnsiballZ_copy.py'
Feb 16 13:06:03 compute-1 sudo[106092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.047 105573 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.048 105573 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpvjdzkvhk/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.903 106042 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.906 106042 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.908 106042 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:03.908 106042 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106042
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.051 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[0946abd1-bcb1-4e6f-a526-8507971d4b05]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:06:04 compute-1 python3.9[106094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247163.2276754-959-118909171356574/.source.yaml _original_basename=.3zl7hdet follow=False checksum=e321a77da89f80ab6ea1e75aee3ae5cf00c93c93 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:04 compute-1 sudo[106092]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.544 106042 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.544 106042 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:06:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:04.544 106042 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:06:04 compute-1 sshd-session[97325]: Connection closed by 192.168.122.30 port 32828
Feb 16 13:06:04 compute-1 sshd-session[97322]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:06:04 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Feb 16 13:06:04 compute-1 systemd[1]: session-23.scope: Consumed 30.110s CPU time.
Feb 16 13:06:04 compute-1 systemd-logind[821]: Session 23 logged out. Waiting for processes to exit.
Feb 16 13:06:04 compute-1 systemd-logind[821]: Removed session 23.
Feb 16 13:06:04 compute-1 podman[106123]: 2026-02-16 13:06:04.946790581 +0000 UTC m=+0.079965317 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.114 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[941241e5-0441-482b-98ba-62256677a52e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.116 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, column=external_ids, values=({'neutron:ovn-metadata-id': 'ea799fc1-aabd-5ae5-a1ab-884d7dce8316'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.132 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.143 105573 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.144 105573 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.145 105573 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.146 105573 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.147 105573 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.147 105573 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.147 105573 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.147 105573 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.147 105573 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.148 105573 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.149 105573 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.150 105573 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.151 105573 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.152 105573 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.153 105573 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.154 105573 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.155 105573 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.156 105573 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.157 105573 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.158 105573 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.159 105573 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.159 105573 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.159 105573 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.159 105573 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.159 105573 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.160 105573 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.161 105573 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.162 105573 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.163 105573 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.164 105573 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.165 105573 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.166 105573 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.167 105573 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.168 105573 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.169 105573 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.170 105573 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.171 105573 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.172 105573 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.173 105573 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.174 105573 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.175 105573 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.176 105573 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.177 105573 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.178 105573 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.179 105573 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.180 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.181 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.182 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:06:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:06:05.183 105573 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:06:08 compute-1 sshd-session[106150]: Connection closed by authenticating user root 146.190.226.24 port 43032 [preauth]
Feb 16 13:06:10 compute-1 sshd-session[106152]: Accepted publickey for zuul from 192.168.122.30 port 59316 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:06:10 compute-1 systemd-logind[821]: New session 24 of user zuul.
Feb 16 13:06:10 compute-1 systemd[1]: Started Session 24 of User zuul.
Feb 16 13:06:10 compute-1 sshd-session[106152]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:06:11 compute-1 python3.9[106305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:06:12 compute-1 sudo[106459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjksqfixchoryiuunonxzxqlkvftnub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247171.7517643-49-59860918776042/AnsiballZ_command.py'
Feb 16 13:06:12 compute-1 sudo[106459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:12 compute-1 python3.9[106461]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:12 compute-1 sudo[106459]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:14 compute-1 sudo[106624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqmebxelmeipfbtirkwiyuskutllhqwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247173.9102566-71-14731439949083/AnsiballZ_systemd_service.py'
Feb 16 13:06:14 compute-1 sudo[106624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:14 compute-1 python3.9[106626]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:14 compute-1 systemd[1]: Reloading.
Feb 16 13:06:14 compute-1 systemd-rc-local-generator[106647]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:14 compute-1 systemd-sysv-generator[106651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:14 compute-1 sudo[106624]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:15 compute-1 python3.9[106818]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:06:15 compute-1 network[106835]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:06:15 compute-1 network[106836]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:06:15 compute-1 network[106837]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:06:18 compute-1 sudo[107097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlocaapbquhbdybjyezsyctmqnorchbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247178.3146107-109-215678076716499/AnsiballZ_systemd_service.py'
Feb 16 13:06:18 compute-1 sudo[107097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:20 compute-1 python3.9[107099]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:20 compute-1 sudo[107097]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:20 compute-1 sudo[107250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkahbkhxgohfizgrobamlvtiwdmagmay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247180.393775-109-167722362647118/AnsiballZ_systemd_service.py'
Feb 16 13:06:20 compute-1 sudo[107250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:20 compute-1 python3.9[107252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:20 compute-1 sudo[107250]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:21 compute-1 sudo[107403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjxghltbjmrhypvwbnxepjezpnmsoebx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247181.001507-109-89140515742307/AnsiballZ_systemd_service.py'
Feb 16 13:06:21 compute-1 sudo[107403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:21 compute-1 python3.9[107405]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:21 compute-1 sudo[107403]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:21 compute-1 sudo[107556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmvfuwyzcfxrmiwtxkcmndifhremdzet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247181.601548-109-5659409959522/AnsiballZ_systemd_service.py'
Feb 16 13:06:21 compute-1 sudo[107556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:22 compute-1 python3.9[107558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:22 compute-1 sudo[107556]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:22 compute-1 sudo[107709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iragbnopvngattmnnhadviovdmaqjgcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247182.2692063-109-3860778017453/AnsiballZ_systemd_service.py'
Feb 16 13:06:22 compute-1 sudo[107709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:22 compute-1 python3.9[107711]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:22 compute-1 sudo[107709]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:23 compute-1 sudo[107862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvkrgwajymlzvfwhldanmummygpgtbkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247182.9293668-109-69867272241283/AnsiballZ_systemd_service.py'
Feb 16 13:06:23 compute-1 sudo[107862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:23 compute-1 python3.9[107864]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:23 compute-1 sudo[107862]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:24 compute-1 sudo[108015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqnuugcandkrcvluvswjnpysculehiqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247184.0553246-109-95692274579161/AnsiballZ_systemd_service.py'
Feb 16 13:06:24 compute-1 sudo[108015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:24 compute-1 python3.9[108017]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:06:24 compute-1 sudo[108015]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:25 compute-1 sudo[108168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgncbzpnuxnusyfzhyutaknfmzqoladi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247184.9152887-213-96752122389663/AnsiballZ_file.py'
Feb 16 13:06:25 compute-1 sudo[108168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:25 compute-1 python3.9[108170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:25 compute-1 sudo[108168]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:25 compute-1 sudo[108320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrzbgejqkekrwvlclmjmxyzxhfqbzoli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247185.6165564-213-44760236288260/AnsiballZ_file.py'
Feb 16 13:06:25 compute-1 sudo[108320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:26 compute-1 python3.9[108322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:26 compute-1 sudo[108320]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:26 compute-1 sudo[108472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boutzmqdmzoikmyxgauuidbgagziqjwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247186.1647978-213-224373279940045/AnsiballZ_file.py'
Feb 16 13:06:26 compute-1 sudo[108472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:26 compute-1 python3.9[108474]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:26 compute-1 sudo[108472]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:26 compute-1 sudo[108624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsxzwcvlsfbhdyuoctwkgurwnsmiwnnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247186.6679678-213-172720331019424/AnsiballZ_file.py'
Feb 16 13:06:26 compute-1 sudo[108624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:27 compute-1 python3.9[108626]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:27 compute-1 sudo[108624]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:27 compute-1 sudo[108776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktamaafwcshvefzdxuhcadyaohinfies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247187.1776814-213-152836592541423/AnsiballZ_file.py'
Feb 16 13:06:27 compute-1 sudo[108776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:27 compute-1 python3.9[108778]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:27 compute-1 sudo[108776]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:27 compute-1 sudo[108928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgeohvepapnpfobfheqfryuotborsniw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247187.6975696-213-222593773129448/AnsiballZ_file.py'
Feb 16 13:06:27 compute-1 sudo[108928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:28 compute-1 python3.9[108930]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:28 compute-1 sudo[108928]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:28 compute-1 sudo[109080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaorgvzazefqarkaovglzmezqlmckvyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247188.362926-213-108807155297406/AnsiballZ_file.py'
Feb 16 13:06:28 compute-1 sudo[109080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:28 compute-1 python3.9[109082]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:28 compute-1 sudo[109080]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:29 compute-1 sudo[109232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwmcdlpqsckjkzmfxkuwalhtnsioife ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247189.1935644-313-12354060290171/AnsiballZ_file.py'
Feb 16 13:06:29 compute-1 sudo[109232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:29 compute-1 python3.9[109234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:29 compute-1 sudo[109232]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:29 compute-1 sudo[109384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-futckgfyjaqrznnrgpawxjuaphtjgwmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247189.7543094-313-11892799529556/AnsiballZ_file.py'
Feb 16 13:06:29 compute-1 sudo[109384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:30 compute-1 python3.9[109386]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:30 compute-1 sudo[109384]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:30 compute-1 sudo[109536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jriktzulncwfcufmwlqomyajiacpqgga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247190.2784953-313-107052880736891/AnsiballZ_file.py'
Feb 16 13:06:30 compute-1 sudo[109536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:30 compute-1 python3.9[109538]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:30 compute-1 sudo[109536]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:31 compute-1 sudo[109688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nifwfdfnveroptdmimzmwzqmqpaptwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247190.826146-313-62343504699402/AnsiballZ_file.py'
Feb 16 13:06:31 compute-1 sudo[109688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:31 compute-1 python3.9[109690]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:31 compute-1 sudo[109688]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:31 compute-1 sudo[109850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhewifemkicgwqjmizayndpgquisoohb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247191.3819032-313-98603382656100/AnsiballZ_file.py'
Feb 16 13:06:31 compute-1 sudo[109850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:31 compute-1 podman[109814]: 2026-02-16 13:06:31.654973503 +0000 UTC m=+0.060472353 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent)
Feb 16 13:06:31 compute-1 python3.9[109852]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:31 compute-1 sudo[109850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:32 compute-1 sudo[110011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxqwqongcqynmdxcezhjaumlsqxddtgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247191.9384203-313-90636786968282/AnsiballZ_file.py'
Feb 16 13:06:32 compute-1 sudo[110011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:32 compute-1 python3.9[110013]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:32 compute-1 sudo[110011]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:32 compute-1 sudo[110163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owtvcewevckxvjfxjrwknfotfznretcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247192.4611084-313-250429928964424/AnsiballZ_file.py'
Feb 16 13:06:32 compute-1 sudo[110163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:32 compute-1 python3.9[110165]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:06:32 compute-1 sudo[110163]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:33 compute-1 sudo[110315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjoomxdxgzhnlhswcnwnnehhghikpjqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247193.362483-415-125499158026481/AnsiballZ_command.py'
Feb 16 13:06:33 compute-1 sudo[110315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:33 compute-1 python3.9[110317]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:33 compute-1 sudo[110315]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:34 compute-1 python3.9[110469]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:06:35 compute-1 sudo[110630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euocyjerfgauujocdwvvaqxtdyumizeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247194.8460767-451-9727066215689/AnsiballZ_systemd_service.py'
Feb 16 13:06:35 compute-1 sudo[110630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:35 compute-1 podman[110593]: 2026-02-16 13:06:35.121281903 +0000 UTC m=+0.073266598 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:06:35 compute-1 python3.9[110640]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:06:35 compute-1 systemd[1]: Reloading.
Feb 16 13:06:35 compute-1 systemd-rc-local-generator[110667]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:06:35 compute-1 systemd-sysv-generator[110676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:06:35 compute-1 sudo[110630]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:35 compute-1 sudo[110839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drqnznfrruqxznzvybgsborpjoikezxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247195.7437644-467-274020602488348/AnsiballZ_command.py'
Feb 16 13:06:35 compute-1 sudo[110839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:36 compute-1 python3.9[110841]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:36 compute-1 sudo[110839]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:36 compute-1 sudo[110992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phyalquxrreaatinfhplnmvymxmkcsjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247196.2817276-467-174573407160204/AnsiballZ_command.py'
Feb 16 13:06:36 compute-1 sudo[110992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:36 compute-1 python3.9[110994]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:36 compute-1 sudo[110992]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:37 compute-1 sudo[111145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfbqygrlnynleansctfbickkswyyface ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247196.8322308-467-152289034061914/AnsiballZ_command.py'
Feb 16 13:06:37 compute-1 sudo[111145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:37 compute-1 python3.9[111147]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:37 compute-1 sudo[111145]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:37 compute-1 sudo[111298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irzppiytflvyfimzmyvqdajruzvuxfby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247197.4051986-467-111665584998278/AnsiballZ_command.py'
Feb 16 13:06:37 compute-1 sudo[111298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:37 compute-1 python3.9[111300]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:37 compute-1 sudo[111298]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:38 compute-1 sudo[111451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-steymjmdpbfzxjvpynzmxewvetnvzenm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247197.9684222-467-148606375101137/AnsiballZ_command.py'
Feb 16 13:06:38 compute-1 sudo[111451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:38 compute-1 python3.9[111453]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:38 compute-1 sudo[111451]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:38 compute-1 sudo[111604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqubuxyhxxsnwazwyczjakvlqinsxors ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247198.501518-467-178670609814279/AnsiballZ_command.py'
Feb 16 13:06:38 compute-1 sudo[111604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:38 compute-1 python3.9[111606]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:38 compute-1 sudo[111604]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:39 compute-1 sudo[111757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxdbigprzmrxsaqxnexghtpnxanowua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247199.1094134-467-14851133832236/AnsiballZ_command.py'
Feb 16 13:06:39 compute-1 sudo[111757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:39 compute-1 python3.9[111759]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:06:39 compute-1 sudo[111757]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:40 compute-1 sudo[111910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftgcmponjgzhmaseknvsebeqxcqwpuzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247200.0181007-575-253834268115572/AnsiballZ_getent.py'
Feb 16 13:06:40 compute-1 sudo[111910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:40 compute-1 python3.9[111912]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 16 13:06:40 compute-1 sudo[111910]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:41 compute-1 sudo[112063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxphsysqgxenkymqehcukvhaqbcgiowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247200.786125-591-266010215253117/AnsiballZ_group.py'
Feb 16 13:06:41 compute-1 sudo[112063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:41 compute-1 python3.9[112065]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:06:41 compute-1 groupadd[112066]: group added to /etc/group: name=libvirt, GID=42473
Feb 16 13:06:41 compute-1 groupadd[112066]: group added to /etc/gshadow: name=libvirt
Feb 16 13:06:41 compute-1 groupadd[112066]: new group: name=libvirt, GID=42473
Feb 16 13:06:41 compute-1 sudo[112063]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:42 compute-1 sudo[112221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjuqeixuhvyaqkvsgmreaurfhxrvtwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247201.6643167-607-116921197437368/AnsiballZ_user.py'
Feb 16 13:06:42 compute-1 sudo[112221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:42 compute-1 python3.9[112223]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:06:42 compute-1 useradd[112225]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 13:06:42 compute-1 sudo[112221]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:43 compute-1 sudo[112381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgbxeezabrwlmkwggxogtfltkansszqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247202.8422208-629-74046989486937/AnsiballZ_setup.py'
Feb 16 13:06:43 compute-1 sudo[112381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:43 compute-1 python3.9[112383]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:06:43 compute-1 sudo[112381]: pam_unix(sudo:session): session closed for user root
Feb 16 13:06:44 compute-1 sudo[112465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnzczmbniurtdmxnzcuummkveenjovto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247202.8422208-629-74046989486937/AnsiballZ_dnf.py'
Feb 16 13:06:44 compute-1 sudo[112465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:06:44 compute-1 python3.9[112467]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:07:02 compute-1 podman[112657]: 2026-02-16 13:07:02.005143911 +0000 UTC m=+0.121481362 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:07:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:07:03.307 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:07:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:07:03.308 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:07:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:07:03.308 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:07:06 compute-1 podman[112677]: 2026-02-16 13:07:06.018799125 +0000 UTC m=+0.135107350 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, config_id=ovn_controller)
Feb 16 13:07:09 compute-1 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:07:09 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:07:10 compute-1 sshd-session[112710]: Connection closed by authenticating user root 146.190.226.24 port 47298 [preauth]
Feb 16 13:07:21 compute-1 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:07:21 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:07:32 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 16 13:07:32 compute-1 podman[112999]: 2026-02-16 13:07:32.942729158 +0000 UTC m=+0.058668288 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:07:36 compute-1 podman[116997]: 2026-02-16 13:07:36.927811436 +0000 UTC m=+0.068033451 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 13:08:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:08:03.309 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:08:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:08:03.310 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:08:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:08:03.310 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:08:03 compute-1 podman[129654]: 2026-02-16 13:08:03.907289312 +0000 UTC m=+0.051785799 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:08:04 compute-1 kernel: SELinux:  Converting 2767 SID table entries...
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability open_perms=1
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability always_check_network=0
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 13:08:04 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 13:08:05 compute-1 groupadd[129686]: group added to /etc/group: name=dnsmasq, GID=993
Feb 16 13:08:05 compute-1 groupadd[129686]: group added to /etc/gshadow: name=dnsmasq
Feb 16 13:08:05 compute-1 groupadd[129686]: new group: name=dnsmasq, GID=993
Feb 16 13:08:05 compute-1 useradd[129693]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 16 13:08:05 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 13:08:05 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 16 13:08:05 compute-1 dbus-broker-launch[784]: Noticed file-system modification, trigger reload.
Feb 16 13:08:06 compute-1 groupadd[129706]: group added to /etc/group: name=clevis, GID=992
Feb 16 13:08:06 compute-1 groupadd[129706]: group added to /etc/gshadow: name=clevis
Feb 16 13:08:06 compute-1 groupadd[129706]: new group: name=clevis, GID=992
Feb 16 13:08:06 compute-1 useradd[129713]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 16 13:08:06 compute-1 usermod[129723]: add 'clevis' to group 'tss'
Feb 16 13:08:06 compute-1 usermod[129723]: add 'clevis' to shadow group 'tss'
Feb 16 13:08:07 compute-1 podman[129736]: 2026-02-16 13:08:07.321369722 +0000 UTC m=+0.077622701 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:08:08 compute-1 polkitd[44483]: Reloading rules
Feb 16 13:08:08 compute-1 polkitd[44483]: Collecting garbage unconditionally...
Feb 16 13:08:08 compute-1 polkitd[44483]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 13:08:08 compute-1 polkitd[44483]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 13:08:08 compute-1 polkitd[44483]: Finished loading, compiling and executing 3 rules
Feb 16 13:08:08 compute-1 polkitd[44483]: Reloading rules
Feb 16 13:08:08 compute-1 polkitd[44483]: Collecting garbage unconditionally...
Feb 16 13:08:08 compute-1 polkitd[44483]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 13:08:08 compute-1 polkitd[44483]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 13:08:08 compute-1 polkitd[44483]: Finished loading, compiling and executing 3 rules
Feb 16 13:08:09 compute-1 groupadd[129941]: group added to /etc/group: name=ceph, GID=167
Feb 16 13:08:09 compute-1 groupadd[129941]: group added to /etc/gshadow: name=ceph
Feb 16 13:08:09 compute-1 groupadd[129941]: new group: name=ceph, GID=167
Feb 16 13:08:09 compute-1 useradd[129947]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 16 13:08:12 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Feb 16 13:08:12 compute-1 sshd[1017]: Received signal 15; terminating.
Feb 16 13:08:12 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Feb 16 13:08:12 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Feb 16 13:08:12 compute-1 systemd[1]: sshd.service: Consumed 3.553s CPU time, read 32.0K from disk, written 128.0K to disk.
Feb 16 13:08:12 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Feb 16 13:08:12 compute-1 systemd[1]: Stopping sshd-keygen.target...
Feb 16 13:08:12 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:12 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:12 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 13:08:12 compute-1 systemd[1]: Reached target sshd-keygen.target.
Feb 16 13:08:12 compute-1 systemd[1]: Starting OpenSSH server daemon...
Feb 16 13:08:12 compute-1 sshd[130466]: Server listening on 0.0.0.0 port 22.
Feb 16 13:08:12 compute-1 sshd[130466]: Server listening on :: port 22.
Feb 16 13:08:12 compute-1 systemd[1]: Started OpenSSH server daemon.
Feb 16 13:08:13 compute-1 sshd-session[130575]: Connection closed by authenticating user root 146.190.226.24 port 59124 [preauth]
Feb 16 13:08:14 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:08:14 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:08:14 compute-1 systemd[1]: Reloading.
Feb 16 13:08:14 compute-1 systemd-rc-local-generator[130723]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:14 compute-1 systemd-sysv-generator[130728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:14 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:08:16 compute-1 sudo[112465]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:19 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:08:19 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:08:19 compute-1 systemd[1]: man-db-cache-update.service: Consumed 6.773s CPU time.
Feb 16 13:08:19 compute-1 systemd[1]: run-r8137537ec051449c896a28ed99915931.service: Deactivated successfully.
Feb 16 13:08:19 compute-1 sudo[139249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyemeequofkgjlscupauvsgzyyhqocyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247299.3110068-654-13070982903167/AnsiballZ_systemd.py'
Feb 16 13:08:19 compute-1 sudo[139249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:20 compute-1 python3.9[139251]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:20 compute-1 sshd-session[139220]: Invalid user sol from 2.57.122.210 port 53742
Feb 16 13:08:20 compute-1 systemd[1]: Reloading.
Feb 16 13:08:20 compute-1 systemd-rc-local-generator[139275]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:20 compute-1 systemd-sysv-generator[139279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:20 compute-1 sshd-session[139220]: Connection closed by invalid user sol 2.57.122.210 port 53742 [preauth]
Feb 16 13:08:20 compute-1 sudo[139249]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:20 compute-1 sudo[139445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbhjzdmjaysoqsocutvsbaxqmjmgerfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247300.694575-654-268444208579477/AnsiballZ_systemd.py'
Feb 16 13:08:20 compute-1 sudo[139445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:21 compute-1 python3.9[139447]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:21 compute-1 systemd[1]: Reloading.
Feb 16 13:08:21 compute-1 systemd-rc-local-generator[139478]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:21 compute-1 systemd-sysv-generator[139481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:21 compute-1 sudo[139445]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:22 compute-1 sudo[139642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffsddhrvijozxvqcixabvexjctvwkigf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247301.7342074-654-36763966332372/AnsiballZ_systemd.py'
Feb 16 13:08:22 compute-1 sudo[139642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:22 compute-1 python3.9[139644]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:22 compute-1 systemd[1]: Reloading.
Feb 16 13:08:22 compute-1 systemd-sysv-generator[139673]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:22 compute-1 systemd-rc-local-generator[139669]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:22 compute-1 sudo[139642]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:23 compute-1 sudo[139839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spnjyxkikddtawdnpdsxdpqebjhzvknz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247302.9578536-654-119736509310503/AnsiballZ_systemd.py'
Feb 16 13:08:23 compute-1 sudo[139839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:23 compute-1 python3.9[139841]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:23 compute-1 systemd[1]: Reloading.
Feb 16 13:08:23 compute-1 systemd-sysv-generator[139873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:23 compute-1 systemd-rc-local-generator[139868]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:23 compute-1 sudo[139839]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:24 compute-1 sudo[140035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwdfldizejvitgiqtmksccaoommzvkgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247304.0132265-711-224735957761728/AnsiballZ_systemd.py'
Feb 16 13:08:24 compute-1 sudo[140035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:24 compute-1 python3.9[140037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:24 compute-1 systemd[1]: Reloading.
Feb 16 13:08:24 compute-1 systemd-rc-local-generator[140068]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:24 compute-1 systemd-sysv-generator[140074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:24 compute-1 sudo[140035]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:25 compute-1 sudo[140231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stpyiatirjudgdvjvklrhlelxafscxmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247305.0541735-711-143352112858324/AnsiballZ_systemd.py'
Feb 16 13:08:25 compute-1 sudo[140231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:25 compute-1 python3.9[140233]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:25 compute-1 systemd[1]: Reloading.
Feb 16 13:08:25 compute-1 systemd-rc-local-generator[140265]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:25 compute-1 systemd-sysv-generator[140268]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:25 compute-1 sudo[140231]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:26 compute-1 sudo[140428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-locessficogqtxtttsdmqkqukvtkqdcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247306.1213152-711-104687367066205/AnsiballZ_systemd.py'
Feb 16 13:08:26 compute-1 sudo[140428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:26 compute-1 python3.9[140430]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:26 compute-1 systemd[1]: Reloading.
Feb 16 13:08:27 compute-1 systemd-rc-local-generator[140460]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:27 compute-1 systemd-sysv-generator[140465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:27 compute-1 sudo[140428]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:27 compute-1 sudo[140625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxzbboevmtkuuxsrlewgywewrzhvjegd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247307.3090522-711-126828967624740/AnsiballZ_systemd.py'
Feb 16 13:08:27 compute-1 sudo[140625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:27 compute-1 python3.9[140627]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:27 compute-1 sudo[140625]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:28 compute-1 sudo[140780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvpnwbgyfjjdrqpzvwueeknrecstvuwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247308.0186586-711-274758895949450/AnsiballZ_systemd.py'
Feb 16 13:08:28 compute-1 sudo[140780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:28 compute-1 python3.9[140782]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:28 compute-1 systemd[1]: Reloading.
Feb 16 13:08:28 compute-1 systemd-sysv-generator[140820]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:28 compute-1 systemd-rc-local-generator[140814]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:28 compute-1 sudo[140780]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:29 compute-1 sudo[140977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zynpmvvuuzwwzhueongutfmbaibwaviy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247309.1558917-783-135361527376185/AnsiballZ_systemd.py'
Feb 16 13:08:29 compute-1 sudo[140977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:29 compute-1 python3.9[140979]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 13:08:29 compute-1 systemd[1]: Reloading.
Feb 16 13:08:29 compute-1 systemd-rc-local-generator[141013]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:08:29 compute-1 systemd-sysv-generator[141017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:08:30 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 16 13:08:30 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 16 13:08:30 compute-1 sudo[140977]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:30 compute-1 sudo[141177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvkelssuguidejppvznyyopvvlxxmugo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247310.5470772-799-6781352764235/AnsiballZ_systemd.py'
Feb 16 13:08:30 compute-1 sudo[141177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:31 compute-1 python3.9[141179]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:31 compute-1 sudo[141177]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:31 compute-1 sudo[141332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnznomrshtxxlfnevngjoaoolpujbkgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247311.346001-799-265088644270368/AnsiballZ_systemd.py'
Feb 16 13:08:31 compute-1 sudo[141332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:31 compute-1 python3.9[141334]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:32 compute-1 sudo[141332]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:32 compute-1 sudo[141487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvnstgksvmjxkhyzcaddmbvldohjnfqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247312.1392243-799-248258577248328/AnsiballZ_systemd.py'
Feb 16 13:08:32 compute-1 sudo[141487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:32 compute-1 python3.9[141489]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:32 compute-1 sudo[141487]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:33 compute-1 sudo[141642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjwyuzfpfjmherauffmmemblrqilswaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247312.9002802-799-85768397031522/AnsiballZ_systemd.py'
Feb 16 13:08:33 compute-1 sudo[141642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:33 compute-1 python3.9[141644]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:33 compute-1 sudo[141642]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:33 compute-1 sudo[141808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rogkdtftumzqlnewjvmuofmulhotuwhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247313.6832767-799-105343048590494/AnsiballZ_systemd.py'
Feb 16 13:08:33 compute-1 sudo[141808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:34 compute-1 podman[141771]: 2026-02-16 13:08:34.029215015 +0000 UTC m=+0.085611687 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:08:34 compute-1 python3.9[141814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:34 compute-1 sudo[141808]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:35 compute-1 sudo[141969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xerqrzzqsvxpzlhufnvezwhzvfybpcrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247314.7377138-799-88176924630858/AnsiballZ_systemd.py'
Feb 16 13:08:35 compute-1 sudo[141969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:35 compute-1 python3.9[141971]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:35 compute-1 sudo[141969]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:35 compute-1 sudo[142124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwjwrpalvipqrsnvwqverqhanfkzzjnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247315.4803293-799-51904788562835/AnsiballZ_systemd.py'
Feb 16 13:08:35 compute-1 sudo[142124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:36 compute-1 python3.9[142126]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:36 compute-1 sudo[142124]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:36 compute-1 sudo[142279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldmvitsfsarudhuxnqivjhwlrmakdscs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247316.2463691-799-177232269996591/AnsiballZ_systemd.py'
Feb 16 13:08:36 compute-1 sudo[142279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:36 compute-1 python3.9[142281]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:36 compute-1 sudo[142279]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:37 compute-1 sudo[142434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sykzcaegagiovuvfaejqqyxictjweuop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247317.0119631-799-141983979220194/AnsiballZ_systemd.py'
Feb 16 13:08:37 compute-1 sudo[142434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:37 compute-1 python3.9[142436]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:37 compute-1 sudo[142434]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:37 compute-1 podman[142438]: 2026-02-16 13:08:37.706976261 +0000 UTC m=+0.103489968 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:08:37 compute-1 sudo[142615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejnrmphvgncrtwyxzwhpuylpkhhnvjwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247317.7687097-799-14557302092091/AnsiballZ_systemd.py'
Feb 16 13:08:37 compute-1 sudo[142615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:38 compute-1 python3.9[142617]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:38 compute-1 sudo[142615]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:38 compute-1 sudo[142770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhfcrbvcqjgltczmcllnsbaawbbipuxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247318.4241884-799-220983652954854/AnsiballZ_systemd.py'
Feb 16 13:08:38 compute-1 sudo[142770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:39 compute-1 python3.9[142772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:39 compute-1 sudo[142770]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:39 compute-1 sudo[142925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pspbsrvrsyvwleqtuhtlknzvoszyvmil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247319.23821-799-263133845852022/AnsiballZ_systemd.py'
Feb 16 13:08:39 compute-1 sudo[142925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:39 compute-1 python3.9[142927]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:39 compute-1 sudo[142925]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:40 compute-1 sudo[143080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srlgjnokensplmjtqbewefokqwrjshms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247320.0530868-799-10134812945761/AnsiballZ_systemd.py'
Feb 16 13:08:40 compute-1 sudo[143080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:40 compute-1 python3.9[143082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:40 compute-1 sudo[143080]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:41 compute-1 sudo[143235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufpckaqvuwftiondodddlzkxlhtioybf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247320.7802956-799-29593622096749/AnsiballZ_systemd.py'
Feb 16 13:08:41 compute-1 sudo[143235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:41 compute-1 python3.9[143237]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 13:08:41 compute-1 sudo[143235]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:43 compute-1 sudo[143390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtdpqlnfstvonhpzwsqyzkbyjbamqxgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247323.5813422-1003-135028189784068/AnsiballZ_file.py'
Feb 16 13:08:43 compute-1 sudo[143390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:44 compute-1 python3.9[143392]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:44 compute-1 sudo[143390]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:44 compute-1 sudo[143542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhobsxrsnbbexvkkzpgekjjerljucbgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247324.2464368-1003-140637149412693/AnsiballZ_file.py'
Feb 16 13:08:44 compute-1 sudo[143542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:44 compute-1 python3.9[143544]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:44 compute-1 sudo[143542]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:45 compute-1 sudo[143694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ustwfqvsrrsexcsfexqmfpawchhedprb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247324.827125-1003-37142900111821/AnsiballZ_file.py'
Feb 16 13:08:45 compute-1 sudo[143694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:45 compute-1 python3.9[143696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:45 compute-1 sudo[143694]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:45 compute-1 sudo[143846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxugsprrmgosmbfjehymwdegissfehgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247325.3925836-1003-249206149003496/AnsiballZ_file.py'
Feb 16 13:08:45 compute-1 sudo[143846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:45 compute-1 python3.9[143848]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:45 compute-1 sudo[143846]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:46 compute-1 sudo[143998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqmmribnimqohghdkqsdjrbbzomksdda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247326.004011-1003-140082215372923/AnsiballZ_file.py'
Feb 16 13:08:46 compute-1 sudo[143998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:46 compute-1 python3.9[144000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:46 compute-1 sudo[143998]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:46 compute-1 sudo[144150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcboufgamfijptfyxqzvgxevcjqgdmzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247326.5709615-1003-203515016538233/AnsiballZ_file.py'
Feb 16 13:08:46 compute-1 sudo[144150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:47 compute-1 python3.9[144152]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:08:47 compute-1 sudo[144150]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:47 compute-1 python3.9[144302]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:08:48 compute-1 sudo[144452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uphwcksszaqdlrllaavswzypyfxqhpco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247328.0269454-1105-190986709202344/AnsiballZ_stat.py'
Feb 16 13:08:48 compute-1 sudo[144452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:48 compute-1 python3.9[144454]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:48 compute-1 sudo[144452]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:49 compute-1 sudo[144577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blxllaqwmygsthniskeflhhbwcdeuvfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247328.0269454-1105-190986709202344/AnsiballZ_copy.py'
Feb 16 13:08:49 compute-1 sudo[144577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:49 compute-1 python3.9[144579]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247328.0269454-1105-190986709202344/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:49 compute-1 sudo[144577]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:49 compute-1 sudo[144729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvkvmqtexpthortaxyegobwheztafmcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247329.5240357-1105-215360094795556/AnsiballZ_stat.py'
Feb 16 13:08:49 compute-1 sudo[144729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:49 compute-1 python3.9[144731]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:50 compute-1 sudo[144729]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:50 compute-1 sudo[144854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymxmeajytycxgoccpkhaqyyzykshwjtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247329.5240357-1105-215360094795556/AnsiballZ_copy.py'
Feb 16 13:08:50 compute-1 sudo[144854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:50 compute-1 python3.9[144856]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247329.5240357-1105-215360094795556/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:50 compute-1 sudo[144854]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:50 compute-1 sudo[145006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldevteqhkqsqjaecbfijhytxqvbwhjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247330.6896973-1105-185680888316879/AnsiballZ_stat.py'
Feb 16 13:08:50 compute-1 sudo[145006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:51 compute-1 python3.9[145008]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:51 compute-1 sudo[145006]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:51 compute-1 sudo[145131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwbecbatparriakuqoejscelbwbwxgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247330.6896973-1105-185680888316879/AnsiballZ_copy.py'
Feb 16 13:08:51 compute-1 sudo[145131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:51 compute-1 python3.9[145133]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247330.6896973-1105-185680888316879/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:51 compute-1 sudo[145131]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:52 compute-1 sudo[145283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtlsxhmmvogxeksnpgcpopklyadonaxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247331.8817368-1105-40849795896042/AnsiballZ_stat.py'
Feb 16 13:08:52 compute-1 sudo[145283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:52 compute-1 python3.9[145285]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:52 compute-1 sudo[145283]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:52 compute-1 sudo[145408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebxansurbdfpcscfaggxdooefciyhfxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247331.8817368-1105-40849795896042/AnsiballZ_copy.py'
Feb 16 13:08:52 compute-1 sudo[145408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:52 compute-1 python3.9[145410]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247331.8817368-1105-40849795896042/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:52 compute-1 sudo[145408]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:53 compute-1 sudo[145560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbxefnfvfnqarakvbdttpouepxhuvhgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247332.990745-1105-205767493942787/AnsiballZ_stat.py'
Feb 16 13:08:53 compute-1 sudo[145560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:53 compute-1 python3.9[145562]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:53 compute-1 sudo[145560]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:54 compute-1 sudo[145685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpuxbthpsgrzivlpnbacyqigeeksewwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247332.990745-1105-205767493942787/AnsiballZ_copy.py'
Feb 16 13:08:54 compute-1 sudo[145685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:54 compute-1 python3.9[145687]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247332.990745-1105-205767493942787/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:54 compute-1 sudo[145685]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:54 compute-1 sudo[145837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atinhumjxdnzbevhqveqpucjrenfafvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247334.393094-1105-126834241270996/AnsiballZ_stat.py'
Feb 16 13:08:54 compute-1 sudo[145837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:54 compute-1 python3.9[145839]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:54 compute-1 sudo[145837]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:55 compute-1 sudo[145962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cennnooglftscqsvsosfhnwnkxjugqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247334.393094-1105-126834241270996/AnsiballZ_copy.py'
Feb 16 13:08:55 compute-1 sudo[145962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:55 compute-1 python3.9[145964]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247334.393094-1105-126834241270996/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:55 compute-1 sudo[145962]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:55 compute-1 sudo[146114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taiysgpcnhvupfbcihigqspgqdbxombv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247335.501538-1105-220754887863492/AnsiballZ_stat.py'
Feb 16 13:08:55 compute-1 sudo[146114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:55 compute-1 python3.9[146116]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:55 compute-1 sudo[146114]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:56 compute-1 sudo[146237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stouiccqoctgdekzkhlfankrjtftsger ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247335.501538-1105-220754887863492/AnsiballZ_copy.py'
Feb 16 13:08:56 compute-1 sudo[146237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:56 compute-1 python3.9[146239]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247335.501538-1105-220754887863492/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:56 compute-1 sudo[146237]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:56 compute-1 sudo[146389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wznehecvaulzefjespfnbrxaklgcjszx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247336.6300838-1105-231415113026684/AnsiballZ_stat.py'
Feb 16 13:08:56 compute-1 sudo[146389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:57 compute-1 python3.9[146391]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:08:57 compute-1 sudo[146389]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:57 compute-1 sudo[146514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcrljfnrburtgnkavuqswqguyxbwhzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247336.6300838-1105-231415113026684/AnsiballZ_copy.py'
Feb 16 13:08:57 compute-1 sudo[146514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:57 compute-1 python3.9[146516]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771247336.6300838-1105-231415113026684/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:57 compute-1 sudo[146514]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:58 compute-1 sudo[146666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbcpxvllgqslidhsauwvciiczlaweurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247338.3511024-1331-77313952243267/AnsiballZ_command.py'
Feb 16 13:08:58 compute-1 sudo[146666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:58 compute-1 python3.9[146668]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 16 13:08:58 compute-1 sudo[146666]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:59 compute-1 sudo[146819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cndgcjimcgusldhdcmlnlxxkkyyjzwju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247339.1190207-1349-64201438672623/AnsiballZ_file.py'
Feb 16 13:08:59 compute-1 sudo[146819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:08:59 compute-1 python3.9[146821]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:08:59 compute-1 sudo[146819]: pam_unix(sudo:session): session closed for user root
Feb 16 13:08:59 compute-1 sudo[146971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdoguolpaazyjyoiwemsxlgkgtacguqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247339.698013-1349-111241161480697/AnsiballZ_file.py'
Feb 16 13:08:59 compute-1 sudo[146971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:00 compute-1 python3.9[146973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:00 compute-1 sudo[146971]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:00 compute-1 sudo[147123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjxskqccwmfdtkttwtyxkiwpommmracf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247340.2659235-1349-118209007899829/AnsiballZ_file.py'
Feb 16 13:09:00 compute-1 sudo[147123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:00 compute-1 python3.9[147125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:00 compute-1 sudo[147123]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:01 compute-1 sudo[147275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdnztabvlxcreqxwcyhklckfglbtfoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247340.825624-1349-157726347631401/AnsiballZ_file.py'
Feb 16 13:09:01 compute-1 sudo[147275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:01 compute-1 python3.9[147277]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:01 compute-1 sudo[147275]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:01 compute-1 sudo[147427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exzpdnibmgnfpflyrfdsskeetqyrwpei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247341.4973996-1349-137010659257056/AnsiballZ_file.py'
Feb 16 13:09:01 compute-1 sudo[147427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:01 compute-1 python3.9[147429]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:01 compute-1 sudo[147427]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:02 compute-1 sudo[147579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdfdbubwybvpziqvzwryntpdvnchtnjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247342.0662506-1349-184746885752343/AnsiballZ_file.py'
Feb 16 13:09:02 compute-1 sudo[147579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:02 compute-1 python3.9[147581]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:02 compute-1 sudo[147579]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:02 compute-1 sudo[147731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksbbpirbrslarwdavfjkwagjbwhsojrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247342.6313143-1349-34602730610879/AnsiballZ_file.py'
Feb 16 13:09:02 compute-1 sudo[147731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:03 compute-1 python3.9[147733]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:03 compute-1 sudo[147731]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:09:03.310 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:09:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:09:03.311 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:09:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:09:03.311 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:09:03 compute-1 sudo[147883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpdmcqydffwloqijijvimedoiutfxoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247343.218207-1349-206413151159174/AnsiballZ_file.py'
Feb 16 13:09:03 compute-1 sudo[147883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:03 compute-1 python3.9[147885]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:03 compute-1 sudo[147883]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:04 compute-1 sudo[148035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rorsgjvjruagnkrflssfsauuyeuqlfcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247343.785996-1349-234016701013078/AnsiballZ_file.py'
Feb 16 13:09:04 compute-1 sudo[148035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:04 compute-1 python3.9[148037]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:04 compute-1 sudo[148035]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:04 compute-1 sudo[148198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cofgdtapejuvfwojnqgbqwpamjtxjlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247344.3089116-1349-260689284864387/AnsiballZ_file.py'
Feb 16 13:09:04 compute-1 sudo[148198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:04 compute-1 podman[148161]: 2026-02-16 13:09:04.59672128 +0000 UTC m=+0.049242809 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 16 13:09:04 compute-1 python3.9[148206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:04 compute-1 sudo[148198]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:05 compute-1 sudo[148358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhbaznuzdkiwcilraiznmucktxorvnma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247344.8981748-1349-41892071711311/AnsiballZ_file.py'
Feb 16 13:09:05 compute-1 sudo[148358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:05 compute-1 python3.9[148360]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:05 compute-1 sudo[148358]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:05 compute-1 sudo[148510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjdmuxzhhqvyeictqzxxnozwoszeqjut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247345.426518-1349-105548440237174/AnsiballZ_file.py'
Feb 16 13:09:05 compute-1 sudo[148510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:05 compute-1 python3.9[148512]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:05 compute-1 sudo[148510]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:06 compute-1 sudo[148662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojmwnlgozdledxcrfjcsxbfyvrammef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247345.9661195-1349-184004861073766/AnsiballZ_file.py'
Feb 16 13:09:06 compute-1 sudo[148662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:06 compute-1 python3.9[148664]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:06 compute-1 sudo[148662]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:06 compute-1 sudo[148814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puhyskibvsomzqcsvotkrsbabyyzuqwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247346.5700283-1349-276903363696295/AnsiballZ_file.py'
Feb 16 13:09:06 compute-1 sudo[148814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:06 compute-1 python3.9[148816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:06 compute-1 sudo[148814]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:07 compute-1 podman[148841]: 2026-02-16 13:09:07.955122478 +0000 UTC m=+0.098119269 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 16 13:09:08 compute-1 sudo[148992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdqjvwufllnvgnfplbghtplfuhivqqnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247348.063222-1547-52566353878891/AnsiballZ_stat.py'
Feb 16 13:09:08 compute-1 sudo[148992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:08 compute-1 python3.9[148994]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:08 compute-1 sudo[148992]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:08 compute-1 sudo[149115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhngzmubzvwlffyffnayeqzgcgwdhzba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247348.063222-1547-52566353878891/AnsiballZ_copy.py'
Feb 16 13:09:08 compute-1 sudo[149115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:09 compute-1 python3.9[149117]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247348.063222-1547-52566353878891/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:09 compute-1 sudo[149115]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:09 compute-1 sudo[149267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aexrspnciufejbvagnepevceujoprbkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247349.224202-1547-101680666275292/AnsiballZ_stat.py'
Feb 16 13:09:09 compute-1 sudo[149267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:09 compute-1 python3.9[149269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:09 compute-1 sudo[149267]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:09 compute-1 sudo[149390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhcksvleyjvrvndapgqtawgbwjykhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247349.224202-1547-101680666275292/AnsiballZ_copy.py'
Feb 16 13:09:09 compute-1 sudo[149390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:10 compute-1 python3.9[149392]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247349.224202-1547-101680666275292/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:10 compute-1 sudo[149390]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:10 compute-1 sudo[149542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaikmzajwuqdgrvzbaxgebjvrkgnmcbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247350.3097475-1547-213872457668639/AnsiballZ_stat.py'
Feb 16 13:09:10 compute-1 sudo[149542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:10 compute-1 python3.9[149544]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:10 compute-1 sudo[149542]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:11 compute-1 sudo[149665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhltgqwsqbxspcalsmorxhdfsdswjpqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247350.3097475-1547-213872457668639/AnsiballZ_copy.py'
Feb 16 13:09:11 compute-1 sudo[149665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:11 compute-1 python3.9[149667]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247350.3097475-1547-213872457668639/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:11 compute-1 sudo[149665]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:11 compute-1 sudo[149817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pombxzzwfhohljetvnsmzizlumlziskm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247351.4597776-1547-131155861883269/AnsiballZ_stat.py'
Feb 16 13:09:11 compute-1 sudo[149817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:11 compute-1 python3.9[149819]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:11 compute-1 sudo[149817]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:12 compute-1 sudo[149940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wehrqszrbyerpubdxsbseteskkgodyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247351.4597776-1547-131155861883269/AnsiballZ_copy.py'
Feb 16 13:09:12 compute-1 sudo[149940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:12 compute-1 python3.9[149942]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247351.4597776-1547-131155861883269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:12 compute-1 sudo[149940]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:12 compute-1 sudo[150092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqsmrxocwyqwscpvalbonlejbwijqxex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247352.6208098-1547-221003101494121/AnsiballZ_stat.py'
Feb 16 13:09:12 compute-1 sudo[150092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:13 compute-1 python3.9[150094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:13 compute-1 sudo[150092]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:13 compute-1 sudo[150215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhxrgamizklsyrborhuewlpobyoltjll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247352.6208098-1547-221003101494121/AnsiballZ_copy.py'
Feb 16 13:09:13 compute-1 sudo[150215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:13 compute-1 python3.9[150217]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247352.6208098-1547-221003101494121/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:13 compute-1 sudo[150215]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:13 compute-1 sudo[150367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxzastrqgpxlcsiskoksjpzryuhjmxzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247353.6875541-1547-68870611634120/AnsiballZ_stat.py'
Feb 16 13:09:13 compute-1 sudo[150367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:14 compute-1 python3.9[150369]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:14 compute-1 sudo[150367]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:14 compute-1 sudo[150490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntigfwxvfcflrnxryqghsljxjishyady ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247353.6875541-1547-68870611634120/AnsiballZ_copy.py'
Feb 16 13:09:14 compute-1 sudo[150490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:14 compute-1 python3.9[150492]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247353.6875541-1547-68870611634120/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:14 compute-1 sudo[150490]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:15 compute-1 sudo[150642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzqqaagdvylcvseltivygpukyhfwenyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247354.8520274-1547-46653479348975/AnsiballZ_stat.py'
Feb 16 13:09:15 compute-1 sudo[150642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:15 compute-1 python3.9[150644]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:15 compute-1 sudo[150642]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:15 compute-1 sudo[150765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eflnpbqnklwgnlrahvijoqysznizeqnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247354.8520274-1547-46653479348975/AnsiballZ_copy.py'
Feb 16 13:09:15 compute-1 sudo[150765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:15 compute-1 python3.9[150767]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247354.8520274-1547-46653479348975/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:15 compute-1 sudo[150765]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:16 compute-1 sudo[150917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byfgzdfwvwknoseqawftlzqkylmdaskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247356.4691293-1547-250703812968719/AnsiballZ_stat.py'
Feb 16 13:09:16 compute-1 sudo[150917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:16 compute-1 python3.9[150919]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:16 compute-1 sudo[150917]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:17 compute-1 sudo[151040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exgacfiszcdkxrrlbruggpjzrbdtvarm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247356.4691293-1547-250703812968719/AnsiballZ_copy.py'
Feb 16 13:09:17 compute-1 sudo[151040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:17 compute-1 python3.9[151042]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247356.4691293-1547-250703812968719/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:17 compute-1 sudo[151040]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:17 compute-1 sudo[151192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgefdbecxcdtrrxtllzcqynvquoxpgwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247357.5916698-1547-227703644121049/AnsiballZ_stat.py'
Feb 16 13:09:17 compute-1 sudo[151192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:18 compute-1 python3.9[151194]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:18 compute-1 sudo[151192]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:18 compute-1 sudo[151315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztrgictgcwpegnckgplicqyhpdeekwri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247357.5916698-1547-227703644121049/AnsiballZ_copy.py'
Feb 16 13:09:18 compute-1 sudo[151315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:18 compute-1 python3.9[151317]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247357.5916698-1547-227703644121049/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:18 compute-1 sudo[151315]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:19 compute-1 sudo[151467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arjunyydyzmfybvldppndkigwnbsabqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247358.7448642-1547-191094910843732/AnsiballZ_stat.py'
Feb 16 13:09:19 compute-1 sudo[151467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:19 compute-1 python3.9[151469]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:19 compute-1 sudo[151467]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:19 compute-1 sudo[151590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljydctpkrokfpjdojfqgpnlfcbvvowtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247358.7448642-1547-191094910843732/AnsiballZ_copy.py'
Feb 16 13:09:19 compute-1 sudo[151590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:19 compute-1 python3.9[151592]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247358.7448642-1547-191094910843732/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:19 compute-1 sudo[151590]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:20 compute-1 sudo[151742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btaseagefxyfqyoxjasgybzjmxknjfgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247359.92695-1547-56927921608224/AnsiballZ_stat.py'
Feb 16 13:09:20 compute-1 sudo[151742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:20 compute-1 python3.9[151744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:20 compute-1 sudo[151742]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:20 compute-1 sudo[151865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlwdoilaurqdrstqvuqytirgsqzscbaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247359.92695-1547-56927921608224/AnsiballZ_copy.py'
Feb 16 13:09:20 compute-1 sudo[151865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:21 compute-1 python3.9[151867]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247359.92695-1547-56927921608224/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:21 compute-1 sudo[151865]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:21 compute-1 sudo[152019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcryoykbgimileytyvatcsfppgrvukzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247361.266811-1547-147517412847351/AnsiballZ_stat.py'
Feb 16 13:09:21 compute-1 sudo[152019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:21 compute-1 python3.9[152021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:21 compute-1 sudo[152019]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:21 compute-1 sshd-session[151990]: Connection closed by authenticating user root 146.190.226.24 port 53924 [preauth]
Feb 16 13:09:22 compute-1 sudo[152142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgfsyfavlumzionpshixldcypwoxjbef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247361.266811-1547-147517412847351/AnsiballZ_copy.py'
Feb 16 13:09:22 compute-1 sudo[152142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:22 compute-1 python3.9[152144]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247361.266811-1547-147517412847351/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:22 compute-1 sudo[152142]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:22 compute-1 sudo[152294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjqqgaqbjvyhcvvgcnlhzlppqfqbjrbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247362.4409752-1547-205920598440513/AnsiballZ_stat.py'
Feb 16 13:09:22 compute-1 sudo[152294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:22 compute-1 python3.9[152296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:22 compute-1 sudo[152294]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:23 compute-1 sudo[152417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrkzpfindpqbppeoeopngimwiefawew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247362.4409752-1547-205920598440513/AnsiballZ_copy.py'
Feb 16 13:09:23 compute-1 sudo[152417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:23 compute-1 python3.9[152419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247362.4409752-1547-205920598440513/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:23 compute-1 sudo[152417]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:23 compute-1 sudo[152569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktuqtzsmmelbuzkmwtipplwgearqudaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247363.562197-1547-144775087287601/AnsiballZ_stat.py'
Feb 16 13:09:23 compute-1 sudo[152569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:24 compute-1 python3.9[152571]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:24 compute-1 sudo[152569]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:24 compute-1 sudo[152692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvccsjtwkhlkfgufehsxoqvqvfzwxhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247363.562197-1547-144775087287601/AnsiballZ_copy.py'
Feb 16 13:09:24 compute-1 sudo[152692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:24 compute-1 python3.9[152694]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247363.562197-1547-144775087287601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:24 compute-1 sudo[152692]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:25 compute-1 python3.9[152844]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:26 compute-1 sudo[152997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arypseklevzrgwhdiokyasqvhjcoosvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247365.7335627-1959-50066126027789/AnsiballZ_seboolean.py'
Feb 16 13:09:26 compute-1 sudo[152997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:26 compute-1 python3.9[152999]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 16 13:09:27 compute-1 sudo[152997]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:27 compute-1 sudo[153153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgotpgqrmttpgsctqktuhjlpfxqmyctx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247367.5810547-1975-274477404045990/AnsiballZ_copy.py'
Feb 16 13:09:27 compute-1 dbus-broker-launch[792]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 16 13:09:27 compute-1 sudo[153153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:28 compute-1 python3.9[153155]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:28 compute-1 sudo[153153]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:28 compute-1 sudo[153305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phbgcjcebpauvegmgiuakpukiwddnmea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247368.2917395-1975-14787199790299/AnsiballZ_copy.py'
Feb 16 13:09:28 compute-1 sudo[153305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:28 compute-1 python3.9[153307]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:28 compute-1 sudo[153305]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:29 compute-1 sudo[153457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horxbyuzwkhigmhcbvvpiztxkfzuqhil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247368.922134-1975-33048454111202/AnsiballZ_copy.py'
Feb 16 13:09:29 compute-1 sudo[153457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:29 compute-1 python3.9[153459]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:29 compute-1 sudo[153457]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:29 compute-1 sudo[153609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tssfhefvjbkpxgfujbicqhcytldtblem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247369.5854933-1975-280618949156895/AnsiballZ_copy.py'
Feb 16 13:09:29 compute-1 sudo[153609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:30 compute-1 python3.9[153611]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:30 compute-1 sudo[153609]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:30 compute-1 sudo[153761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spyeziqyvabtiaadosbhjqqqgcfnzune ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247370.1782155-1975-56785601327769/AnsiballZ_copy.py'
Feb 16 13:09:30 compute-1 sudo[153761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:30 compute-1 python3.9[153763]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:30 compute-1 sudo[153761]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:31 compute-1 sudo[153913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktqsudoveeykezqdjkmpddxailgmvjen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247370.8516352-2047-266298650209460/AnsiballZ_copy.py'
Feb 16 13:09:31 compute-1 sudo[153913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:31 compute-1 python3.9[153915]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:31 compute-1 sudo[153913]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:31 compute-1 sudo[154065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usucvaxleddtnxgsvtswreiudroajkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247371.4719114-2047-66326098209504/AnsiballZ_copy.py'
Feb 16 13:09:31 compute-1 sudo[154065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:31 compute-1 python3.9[154067]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:31 compute-1 sudo[154065]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:32 compute-1 sudo[154217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uewtlebfpzewbejcyoigbnqkwmsditro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247371.9969506-2047-33557318449587/AnsiballZ_copy.py'
Feb 16 13:09:32 compute-1 sudo[154217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:32 compute-1 python3.9[154219]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:32 compute-1 sudo[154217]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:32 compute-1 sudo[154369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgztamvxpuftpaoqrezwzxxprcsibvkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247372.5763106-2047-12937742976649/AnsiballZ_copy.py'
Feb 16 13:09:32 compute-1 sudo[154369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:33 compute-1 python3.9[154371]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:33 compute-1 sudo[154369]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:33 compute-1 sudo[154521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xctmolcuahlirinufqlcwynzzbxzwjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247373.2081933-2047-49170665808881/AnsiballZ_copy.py'
Feb 16 13:09:33 compute-1 sudo[154521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:33 compute-1 python3.9[154523]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:33 compute-1 sudo[154521]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:34 compute-1 sudo[154673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yavynxykruzszbgtyupcjuokiopsmxde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247374.164731-2119-168608151719762/AnsiballZ_systemd.py'
Feb 16 13:09:34 compute-1 sudo[154673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:34 compute-1 python3.9[154675]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:34 compute-1 systemd[1]: Reloading.
Feb 16 13:09:34 compute-1 systemd-rc-local-generator[154718]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:34 compute-1 systemd-sysv-generator[154722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:34 compute-1 podman[154677]: 2026-02-16 13:09:34.919720687 +0000 UTC m=+0.108335201 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:09:35 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Feb 16 13:09:35 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Feb 16 13:09:35 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 16 13:09:35 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 16 13:09:35 compute-1 systemd[1]: Starting libvirt logging daemon...
Feb 16 13:09:35 compute-1 systemd[1]: Started libvirt logging daemon.
Feb 16 13:09:35 compute-1 sudo[154673]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:35 compute-1 sudo[154892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmjojfenkupgbblpnfwpgnxofnsfykh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247375.3495798-2119-201516545871462/AnsiballZ_systemd.py'
Feb 16 13:09:35 compute-1 sudo[154892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:35 compute-1 python3.9[154894]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:35 compute-1 systemd[1]: Reloading.
Feb 16 13:09:36 compute-1 systemd-rc-local-generator[154918]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:36 compute-1 systemd-sysv-generator[154922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:36 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 16 13:09:36 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 16 13:09:36 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 16 13:09:36 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 16 13:09:36 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 16 13:09:36 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 16 13:09:36 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 13:09:36 compute-1 systemd[1]: Started libvirt nodedev daemon.
Feb 16 13:09:36 compute-1 sudo[154892]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:36 compute-1 sudo[155114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuhrjebqyozepdunowqsnrlhztlcgkru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247376.4387958-2119-152957875021615/AnsiballZ_systemd.py'
Feb 16 13:09:36 compute-1 sudo[155114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:36 compute-1 python3.9[155116]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:37 compute-1 systemd[1]: Reloading.
Feb 16 13:09:37 compute-1 systemd-sysv-generator[155146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:37 compute-1 systemd-rc-local-generator[155143]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:37 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 16 13:09:37 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 16 13:09:37 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 16 13:09:37 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 16 13:09:37 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 16 13:09:37 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:09:37 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:09:37 compute-1 sudo[155114]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:37 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 16 13:09:37 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 16 13:09:37 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 16 13:09:37 compute-1 sudo[155339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsikbffchsbedpfvwtbfipmmuerxuwqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247377.5026705-2119-27034650668410/AnsiballZ_systemd.py'
Feb 16 13:09:37 compute-1 sudo[155339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:38 compute-1 python3.9[155341]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:38 compute-1 systemd[1]: Reloading.
Feb 16 13:09:38 compute-1 systemd-sysv-generator[155399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:38 compute-1 systemd-rc-local-generator[155396]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:38 compute-1 podman[155345]: 2026-02-16 13:09:38.222177412 +0000 UTC m=+0.109368958 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:09:38 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Feb 16 13:09:38 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 16 13:09:38 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 16 13:09:38 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 16 13:09:38 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 16 13:09:38 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 16 13:09:38 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 16 13:09:38 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 16 13:09:38 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 16 13:09:38 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 16 13:09:38 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 13:09:38 compute-1 systemd[1]: Started libvirt QEMU daemon.
Feb 16 13:09:38 compute-1 sudo[155339]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:38 compute-1 setroubleshoot[155159]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 565a7557-eee1-45d8-8c0b-6f64f60f2aed
Feb 16 13:09:38 compute-1 setroubleshoot[155159]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 16 13:09:38 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:09:38 compute-1 sudo[155594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbnjucvwzsxrykakqfgoibuecsnsehbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247378.5717063-2119-186701784047568/AnsiballZ_systemd.py'
Feb 16 13:09:38 compute-1 sudo[155594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:39 compute-1 python3.9[155596]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:09:39 compute-1 systemd[1]: Reloading.
Feb 16 13:09:39 compute-1 systemd-rc-local-generator[155620]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:09:39 compute-1 systemd-sysv-generator[155624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:09:39 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Feb 16 13:09:39 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Feb 16 13:09:39 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 16 13:09:39 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 16 13:09:39 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 16 13:09:39 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 16 13:09:39 compute-1 systemd[1]: Starting libvirt secret daemon...
Feb 16 13:09:39 compute-1 systemd[1]: Started libvirt secret daemon.
Feb 16 13:09:39 compute-1 sudo[155594]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:40 compute-1 sudo[155813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzdjdkwmftdrlwvmscquxwrdjajffqpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247379.8851125-2193-21775267134621/AnsiballZ_file.py'
Feb 16 13:09:40 compute-1 sudo[155813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:40 compute-1 python3.9[155815]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:40 compute-1 sudo[155813]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:40 compute-1 sudo[155965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tklxjriimtussvpzybfnintvrmxqxeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247380.628855-2209-171581541256678/AnsiballZ_find.py'
Feb 16 13:09:40 compute-1 sudo[155965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:41 compute-1 python3.9[155967]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:09:41 compute-1 sudo[155965]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:41 compute-1 sudo[156117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuprhvskxytkxujzrmghotavlquzmgpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247381.6129138-2237-25239490770334/AnsiballZ_stat.py'
Feb 16 13:09:41 compute-1 sudo[156117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:42 compute-1 python3.9[156119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:42 compute-1 sudo[156117]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:42 compute-1 sudo[156240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzmpvfqlcgbzwlensjxkcghripqepsff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247381.6129138-2237-25239490770334/AnsiballZ_copy.py'
Feb 16 13:09:42 compute-1 sudo[156240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:42 compute-1 python3.9[156242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247381.6129138-2237-25239490770334/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:42 compute-1 sudo[156240]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:43 compute-1 sudo[156392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqfeauiqemprtorvrcimcqnlonppwhds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247383.0286188-2269-165304912139716/AnsiballZ_file.py'
Feb 16 13:09:43 compute-1 sudo[156392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:43 compute-1 python3.9[156394]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:43 compute-1 sudo[156392]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:44 compute-1 sudo[156544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqjdwfyheeshgirxkwnfvfkdbyrvumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247383.7447-2285-230571847176466/AnsiballZ_stat.py'
Feb 16 13:09:44 compute-1 sudo[156544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:44 compute-1 python3.9[156546]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:44 compute-1 sudo[156544]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:44 compute-1 sudo[156622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhdujbmxmoowvmghtykslfcqlzvcyujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247383.7447-2285-230571847176466/AnsiballZ_file.py'
Feb 16 13:09:44 compute-1 sudo[156622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:44 compute-1 python3.9[156624]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:44 compute-1 sudo[156622]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:45 compute-1 sudo[156774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueepzolitulrzooxxalmedfbyxwjjkia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247384.9276867-2309-249412216897952/AnsiballZ_stat.py'
Feb 16 13:09:45 compute-1 sudo[156774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:45 compute-1 python3.9[156776]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:45 compute-1 sudo[156774]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:45 compute-1 sudo[156852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrvplntbrobycurwkhbfhicvpwnqmba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247384.9276867-2309-249412216897952/AnsiballZ_file.py'
Feb 16 13:09:45 compute-1 sudo[156852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:45 compute-1 python3.9[156854]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.cx8x4z8f recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:45 compute-1 sudo[156852]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:46 compute-1 sudo[157004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eipumattqfzwzgyjuxvttspgeimjcxdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247386.087476-2333-181873275575583/AnsiballZ_stat.py'
Feb 16 13:09:46 compute-1 sudo[157004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:46 compute-1 python3.9[157006]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:46 compute-1 sudo[157004]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:46 compute-1 sudo[157082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrfpkatvkwekhkjrbnwzuokuevjkphl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247386.087476-2333-181873275575583/AnsiballZ_file.py'
Feb 16 13:09:46 compute-1 sudo[157082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:46 compute-1 python3.9[157084]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:46 compute-1 sudo[157082]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:47 compute-1 sudo[157234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztxfnkowvwycavclntkuoxyewrwgerjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247387.3135931-2359-117296234289365/AnsiballZ_command.py'
Feb 16 13:09:47 compute-1 sudo[157234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:47 compute-1 python3.9[157236]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:47 compute-1 sudo[157234]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:48 compute-1 sudo[157387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azfnzvxgqkrtxinmpeuxyowejrisawet ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247388.0394554-2375-28543823515949/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:09:48 compute-1 sudo[157387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:48 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 16 13:09:48 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 16 13:09:48 compute-1 python3[157389]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:09:48 compute-1 sudo[157387]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:49 compute-1 sudo[157539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jssbodnpuwhyjhfyeitibxevmudrxbiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247389.1315372-2391-30947580475579/AnsiballZ_stat.py'
Feb 16 13:09:49 compute-1 sudo[157539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:49 compute-1 python3.9[157541]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:49 compute-1 sudo[157539]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:49 compute-1 sudo[157617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlwkskcvqynrmqgtwnlcprygubocrksg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247389.1315372-2391-30947580475579/AnsiballZ_file.py'
Feb 16 13:09:49 compute-1 sudo[157617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:50 compute-1 python3.9[157619]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:50 compute-1 sudo[157617]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:50 compute-1 sudo[157769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjhetqptnplyxzblmijdyxeneaagkqjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247390.322721-2415-10624249007361/AnsiballZ_stat.py'
Feb 16 13:09:50 compute-1 sudo[157769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:50 compute-1 python3.9[157771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:50 compute-1 sudo[157769]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:51 compute-1 sudo[157894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqieuzasybdpghctjmgccdcczpysbdxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247390.322721-2415-10624249007361/AnsiballZ_copy.py'
Feb 16 13:09:51 compute-1 sudo[157894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:51 compute-1 python3.9[157896]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247390.322721-2415-10624249007361/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:51 compute-1 sudo[157894]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:51 compute-1 sudo[158046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfkyofiqzkktutbdeplvmihxhnjlqrne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247391.5635707-2445-172213340057459/AnsiballZ_stat.py'
Feb 16 13:09:51 compute-1 sudo[158046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:51 compute-1 python3.9[158048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:52 compute-1 sudo[158046]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:52 compute-1 sudo[158124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwphwblybgrpydfabdvagcuhxurggdtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247391.5635707-2445-172213340057459/AnsiballZ_file.py'
Feb 16 13:09:52 compute-1 sudo[158124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:52 compute-1 python3.9[158126]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:52 compute-1 sudo[158124]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:52 compute-1 sudo[158276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtbpecgphhvtqnfzamqkuqwsyhmovaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247392.7017047-2469-88202347423571/AnsiballZ_stat.py'
Feb 16 13:09:52 compute-1 sudo[158276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:53 compute-1 python3.9[158278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:53 compute-1 sudo[158276]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:53 compute-1 sudo[158354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbwehfffhlgamwuxxabcdkwojrjzjvto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247392.7017047-2469-88202347423571/AnsiballZ_file.py'
Feb 16 13:09:53 compute-1 sudo[158354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:53 compute-1 python3.9[158356]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:53 compute-1 sudo[158354]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:54 compute-1 sudo[158506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iercratxiizgxxvsdmebexalczmrztmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247393.8298562-2493-95781771785198/AnsiballZ_stat.py'
Feb 16 13:09:54 compute-1 sudo[158506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:54 compute-1 python3.9[158508]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:09:54 compute-1 sudo[158506]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:54 compute-1 sudo[158631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfztbnqtkujsaexmalqfufoqysswjlax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247393.8298562-2493-95781771785198/AnsiballZ_copy.py'
Feb 16 13:09:54 compute-1 sudo[158631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:54 compute-1 python3.9[158633]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247393.8298562-2493-95781771785198/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:55 compute-1 sudo[158631]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:55 compute-1 sudo[158783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cazhuwjkyueslptzqteaiqyenwlgwfuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247395.1909204-2523-190348231290834/AnsiballZ_file.py'
Feb 16 13:09:55 compute-1 sudo[158783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:55 compute-1 python3.9[158785]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:55 compute-1 sudo[158783]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:56 compute-1 sudo[158935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flsnmlkhszmeppxfnhzacngfwgmvoyyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247396.0013356-2539-89322827469041/AnsiballZ_command.py'
Feb 16 13:09:56 compute-1 sudo[158935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:56 compute-1 python3.9[158937]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:56 compute-1 sudo[158935]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:57 compute-1 sudo[159090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gryigtttcabvfbystipvoyfiwrrkhujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247396.7399561-2555-46112527736300/AnsiballZ_blockinfile.py'
Feb 16 13:09:57 compute-1 sudo[159090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:57 compute-1 python3.9[159092]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:09:57 compute-1 sudo[159090]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:58 compute-1 sudo[159242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgfzkyijqdfhyehbryesahayusfnzlex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247398.0860174-2573-273999120439917/AnsiballZ_command.py'
Feb 16 13:09:58 compute-1 sudo[159242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:58 compute-1 python3.9[159244]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:58 compute-1 sudo[159242]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:59 compute-1 sudo[159395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jukeaxxbewcexsbsemtovhfeovparfxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247398.7928677-2589-261972487366049/AnsiballZ_stat.py'
Feb 16 13:09:59 compute-1 sudo[159395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:59 compute-1 python3.9[159397]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:09:59 compute-1 sudo[159395]: pam_unix(sudo:session): session closed for user root
Feb 16 13:09:59 compute-1 sudo[159549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftckdyrwexfqbqdwhshbbyhlexwosrss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247399.4688814-2605-174813614569163/AnsiballZ_command.py'
Feb 16 13:09:59 compute-1 sudo[159549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:09:59 compute-1 python3.9[159551]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:09:59 compute-1 sudo[159549]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:00 compute-1 sudo[159704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odihmvzhaftphtbbvsubrawlqaszgzqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247400.2064307-2621-260385489363340/AnsiballZ_file.py'
Feb 16 13:10:00 compute-1 sudo[159704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:00 compute-1 python3.9[159706]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:00 compute-1 sudo[159704]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:01 compute-1 sudo[159856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amcvystiptobzpbdeobuncagsigvlmcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247401.0264313-2637-164905784496222/AnsiballZ_stat.py'
Feb 16 13:10:01 compute-1 sudo[159856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:01 compute-1 python3.9[159858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:01 compute-1 sudo[159856]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:02 compute-1 sudo[159979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpwcjqdmuzqgfclrhkkzoaowwsdckadq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247401.0264313-2637-164905784496222/AnsiballZ_copy.py'
Feb 16 13:10:02 compute-1 sudo[159979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:02 compute-1 python3.9[159981]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247401.0264313-2637-164905784496222/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:02 compute-1 sudo[159979]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:02 compute-1 sudo[160131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quhidlcgawodrqyvilzlnhmtnrfiahav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247402.5263126-2667-10023661616729/AnsiballZ_stat.py'
Feb 16 13:10:02 compute-1 sudo[160131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:02 compute-1 python3.9[160133]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:03 compute-1 sudo[160131]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:10:03.311 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:10:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:10:03.313 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:10:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:10:03.313 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:10:03 compute-1 sudo[160254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbokfjomuwvliqlxnpcyygygxeiyjvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247402.5263126-2667-10023661616729/AnsiballZ_copy.py'
Feb 16 13:10:03 compute-1 sudo[160254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:03 compute-1 python3.9[160256]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247402.5263126-2667-10023661616729/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:03 compute-1 sudo[160254]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:04 compute-1 sudo[160406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vayivhrybpchgmgzkeqzlypckctqnspt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247403.8465543-2697-203533469709538/AnsiballZ_stat.py'
Feb 16 13:10:04 compute-1 sudo[160406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:04 compute-1 python3.9[160408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:04 compute-1 sudo[160406]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:04 compute-1 sudo[160529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfhuxhgdjlwpxreoclkoimqklsxbcgpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247403.8465543-2697-203533469709538/AnsiballZ_copy.py'
Feb 16 13:10:04 compute-1 sudo[160529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:04 compute-1 python3.9[160531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247403.8465543-2697-203533469709538/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:04 compute-1 sudo[160529]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:05 compute-1 sudo[160693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esjfvaxhztixdlhrmapconxcsoqrgnoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247405.2461255-2727-2511239759055/AnsiballZ_systemd.py'
Feb 16 13:10:05 compute-1 sudo[160693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:05 compute-1 podman[160655]: 2026-02-16 13:10:05.728937182 +0000 UTC m=+0.062950529 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 16 13:10:06 compute-1 python3.9[160699]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:06 compute-1 systemd[1]: Reloading.
Feb 16 13:10:06 compute-1 systemd-rc-local-generator[160729]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:06 compute-1 systemd-sysv-generator[160732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:06 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Feb 16 13:10:06 compute-1 sudo[160693]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:06 compute-1 sudo[160897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frbuettplcwumdomdktpeacaqtgzkhpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247406.5463502-2743-131160011721030/AnsiballZ_systemd.py'
Feb 16 13:10:06 compute-1 sudo[160897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:07 compute-1 python3.9[160899]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 13:10:07 compute-1 systemd[1]: Reloading.
Feb 16 13:10:07 compute-1 systemd-rc-local-generator[160926]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:07 compute-1 systemd-sysv-generator[160930]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:07 compute-1 systemd[1]: Reloading.
Feb 16 13:10:07 compute-1 systemd-sysv-generator[160968]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:07 compute-1 systemd-rc-local-generator[160965]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:07 compute-1 sudo[160897]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:08 compute-1 sshd-session[106155]: Connection closed by 192.168.122.30 port 59316
Feb 16 13:10:08 compute-1 sshd-session[106152]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:10:08 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Feb 16 13:10:08 compute-1 systemd[1]: session-24.scope: Consumed 2min 57.057s CPU time.
Feb 16 13:10:08 compute-1 systemd-logind[821]: Session 24 logged out. Waiting for processes to exit.
Feb 16 13:10:08 compute-1 systemd-logind[821]: Removed session 24.
Feb 16 13:10:08 compute-1 podman[161009]: 2026-02-16 13:10:08.983093384 +0000 UTC m=+0.086114168 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:10:14 compute-1 sshd-session[161035]: Accepted publickey for zuul from 192.168.122.30 port 36642 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:10:14 compute-1 systemd-logind[821]: New session 25 of user zuul.
Feb 16 13:10:14 compute-1 systemd[1]: Started Session 25 of User zuul.
Feb 16 13:10:14 compute-1 sshd-session[161035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:10:15 compute-1 python3.9[161188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:10:16 compute-1 python3.9[161342]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:10:16 compute-1 network[161359]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:10:16 compute-1 network[161360]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:10:16 compute-1 network[161361]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:10:21 compute-1 sudo[161631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eundlpvsmkolymkcspedknyltlgkiypb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247420.9576178-75-168780224260130/AnsiballZ_setup.py'
Feb 16 13:10:21 compute-1 sudo[161631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:21 compute-1 python3.9[161633]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 13:10:21 compute-1 sudo[161631]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:22 compute-1 sudo[161715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwjrdbcszpjoypgaztuuodjiwpyjcvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247420.9576178-75-168780224260130/AnsiballZ_dnf.py'
Feb 16 13:10:22 compute-1 sudo[161715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:22 compute-1 python3.9[161717]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:10:28 compute-1 sudo[161715]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:28 compute-1 sudo[161870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivizmtpcfjivbluxcbcfkmhljqvbrusu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247428.457314-99-177676428285458/AnsiballZ_stat.py'
Feb 16 13:10:28 compute-1 sudo[161870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:28 compute-1 sshd-session[161719]: Connection closed by authenticating user root 146.190.226.24 port 54118 [preauth]
Feb 16 13:10:29 compute-1 python3.9[161872]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:29 compute-1 sudo[161870]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:29 compute-1 sudo[162022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdhnyaufjteadzhczwmadyzmffsdsyag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247429.4147499-120-36582101223869/AnsiballZ_command.py'
Feb 16 13:10:29 compute-1 sudo[162022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:30 compute-1 python3.9[162024]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:30 compute-1 sudo[162022]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:30 compute-1 sudo[162175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktqozeuczyradeutsyvyvncnmbiswxwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247430.5270789-139-249793701573580/AnsiballZ_stat.py'
Feb 16 13:10:30 compute-1 sudo[162175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:31 compute-1 python3.9[162177]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:31 compute-1 sudo[162175]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:31 compute-1 sudo[162329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ingaeqimviqekmsymtbiowsqapeojqea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.2479334-155-238188174196553/AnsiballZ_command.py'
Feb 16 13:10:31 compute-1 sudo[162329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:31 compute-1 python3.9[162331]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:31 compute-1 sshd-session[162254]: Invalid user sol from 2.57.122.210 port 56432
Feb 16 13:10:31 compute-1 sudo[162329]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:31 compute-1 sshd-session[162254]: Connection closed by invalid user sol 2.57.122.210 port 56432 [preauth]
Feb 16 13:10:32 compute-1 sudo[162482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfsghxptjntmgfqpgmlbsxqaohgmscii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.9727235-171-149447794586504/AnsiballZ_stat.py'
Feb 16 13:10:32 compute-1 sudo[162482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:32 compute-1 python3.9[162484]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:32 compute-1 sudo[162482]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:32 compute-1 sudo[162605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxjykoqtvazlosexcfkqccqdxzulcfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247431.9727235-171-149447794586504/AnsiballZ_copy.py'
Feb 16 13:10:32 compute-1 sudo[162605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:33 compute-1 python3.9[162607]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247431.9727235-171-149447794586504/.source.iscsi _original_basename=.jgrkmg2y follow=False checksum=979caf38040a1861331098d72cfd79c018657cb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:33 compute-1 sudo[162605]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:34 compute-1 sudo[162757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wufbtvkornhpxcbcsowhjvxfefvzuxhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247433.3159955-201-156703262145158/AnsiballZ_file.py'
Feb 16 13:10:34 compute-1 sudo[162757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:34 compute-1 python3.9[162759]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:34 compute-1 sudo[162757]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:35 compute-1 sudo[162909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjielchplxmxdomhsakwudoborsnbubk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247434.4953127-217-115923386043260/AnsiballZ_lineinfile.py'
Feb 16 13:10:35 compute-1 sudo[162909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:35 compute-1 python3.9[162911]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:35 compute-1 sudo[162909]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:35 compute-1 podman[162988]: 2026-02-16 13:10:35.92216363 +0000 UTC m=+0.062782588 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:10:36 compute-1 sudo[163081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbyrciwhznerhhjefswwbicnihwrbxea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247435.6395504-235-100360576959570/AnsiballZ_systemd_service.py'
Feb 16 13:10:36 compute-1 sudo[163081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:36 compute-1 python3.9[163083]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:36 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 16 13:10:36 compute-1 sudo[163081]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:37 compute-1 sudo[163237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfbvaxeljedidosiiwatfyqazohprzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247436.9427552-251-268932194508662/AnsiballZ_systemd_service.py'
Feb 16 13:10:37 compute-1 sudo[163237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:37 compute-1 python3.9[163239]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:10:37 compute-1 systemd[1]: Reloading.
Feb 16 13:10:37 compute-1 systemd-sysv-generator[163266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:37 compute-1 systemd-rc-local-generator[163260]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:37 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 13:10:37 compute-1 systemd[1]: Starting Open-iSCSI...
Feb 16 13:10:37 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Feb 16 13:10:37 compute-1 systemd[1]: Started Open-iSCSI.
Feb 16 13:10:37 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 16 13:10:37 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 16 13:10:37 compute-1 sudo[163237]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:38 compute-1 python3.9[163445]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:10:38 compute-1 network[163462]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:10:38 compute-1 network[163463]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:10:38 compute-1 network[163464]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:10:39 compute-1 podman[163470]: 2026-02-16 13:10:39.625019428 +0000 UTC m=+0.094785843 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:10:42 compute-1 sudo[163758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdbxvbfuxfkkrqflibqvshjbvtjrjejr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247442.727066-297-35213770838689/AnsiballZ_dnf.py'
Feb 16 13:10:42 compute-1 sudo[163758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:43 compute-1 python3.9[163760]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:10:45 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:10:45 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:10:45 compute-1 systemd[1]: Reloading.
Feb 16 13:10:45 compute-1 systemd-rc-local-generator[163807]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:10:45 compute-1 systemd-sysv-generator[163812]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:10:45 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:10:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:10:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:10:45 compute-1 systemd[1]: run-rfae62ed504a64a9ab437559a20979efd.service: Deactivated successfully.
Feb 16 13:10:46 compute-1 sudo[163758]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:46 compute-1 sudo[164080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkamclidgsoacityzorwguoijfyvuvar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247446.3199842-315-226083709627739/AnsiballZ_file.py'
Feb 16 13:10:46 compute-1 sudo[164080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:46 compute-1 python3.9[164082]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 13:10:46 compute-1 sudo[164080]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:47 compute-1 sudo[164232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hueynmyyjgdpujzulcupentgovlmzntq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247446.935749-331-130738039168850/AnsiballZ_modprobe.py'
Feb 16 13:10:47 compute-1 sudo[164232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:47 compute-1 python3.9[164234]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 16 13:10:47 compute-1 sudo[164232]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:48 compute-1 sudo[164388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjjefwqltzavrcgkrxvyaabjgjbxsaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247447.7991083-347-184226482036857/AnsiballZ_stat.py'
Feb 16 13:10:48 compute-1 sudo[164388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:48 compute-1 python3.9[164390]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:48 compute-1 sudo[164388]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:48 compute-1 sudo[164511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-losuoptsighlabixiwvlhvndgcwusiug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247447.7991083-347-184226482036857/AnsiballZ_copy.py'
Feb 16 13:10:48 compute-1 sudo[164511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:48 compute-1 python3.9[164513]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247447.7991083-347-184226482036857/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:48 compute-1 sudo[164511]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:49 compute-1 sudo[164663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xybxuspgwwpphqulqckfqcejmcadjjkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247449.003625-379-33432050442490/AnsiballZ_lineinfile.py'
Feb 16 13:10:49 compute-1 sudo[164663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:49 compute-1 python3.9[164665]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:49 compute-1 sudo[164663]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:50 compute-1 sudo[164815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhqzypizezgdiylcwbqcnqjqubzzgbev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247449.6901069-395-121463956037731/AnsiballZ_systemd.py'
Feb 16 13:10:50 compute-1 sudo[164815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:50 compute-1 python3.9[164817]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:10:50 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 13:10:50 compute-1 systemd[1]: Stopped Load Kernel Modules.
Feb 16 13:10:50 compute-1 systemd[1]: Stopping Load Kernel Modules...
Feb 16 13:10:50 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 16 13:10:50 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 16 13:10:50 compute-1 sudo[164815]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:51 compute-1 sudo[164971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvswhccozkfxgovlerrvikorylufkwgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247450.8115513-412-140834645691820/AnsiballZ_command.py'
Feb 16 13:10:51 compute-1 sudo[164971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:51 compute-1 python3.9[164973]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:51 compute-1 sudo[164971]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:51 compute-1 sudo[165124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gswmcvcyclcvsvfxeophfaeccjxtesti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247451.6693609-431-260016682774011/AnsiballZ_stat.py'
Feb 16 13:10:51 compute-1 sudo[165124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:52 compute-1 python3.9[165126]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:52 compute-1 sudo[165124]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:52 compute-1 sudo[165276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pizyjyxutjqrvgjbfoimnbqwbkvaxxgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247452.4053202-449-76675053501732/AnsiballZ_stat.py'
Feb 16 13:10:52 compute-1 sudo[165276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:52 compute-1 python3.9[165278]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:10:52 compute-1 sudo[165276]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:53 compute-1 sudo[165399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cahxvpnsnrqdhxnxsncvgssqqddqabty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247452.4053202-449-76675053501732/AnsiballZ_copy.py'
Feb 16 13:10:53 compute-1 sudo[165399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:53 compute-1 python3.9[165401]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247452.4053202-449-76675053501732/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:53 compute-1 sudo[165399]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:53 compute-1 sudo[165551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvvjoohhbeeslrtranbefzwtwpgkchkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247453.5928483-479-7970149769931/AnsiballZ_command.py'
Feb 16 13:10:53 compute-1 sudo[165551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:54 compute-1 python3.9[165553]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:10:54 compute-1 sudo[165551]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:54 compute-1 sudo[165704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfbvhezfgahyrdbjpydicmdsobzvjxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247454.264746-495-273840774299626/AnsiballZ_lineinfile.py'
Feb 16 13:10:54 compute-1 sudo[165704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:54 compute-1 python3.9[165706]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:54 compute-1 sudo[165704]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:55 compute-1 sudo[165856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackeifnlywwkgjmpmjspnpzusviucbjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247454.9378672-511-75488218020213/AnsiballZ_replace.py'
Feb 16 13:10:55 compute-1 sudo[165856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:55 compute-1 python3.9[165858]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:55 compute-1 sudo[165856]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:56 compute-1 sudo[166008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqhlvdqqfyodezoftrqwsupxxddhcdtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247455.7872307-527-253476939551959/AnsiballZ_replace.py'
Feb 16 13:10:56 compute-1 sudo[166008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:56 compute-1 python3.9[166010]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:56 compute-1 sudo[166008]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:56 compute-1 sudo[166160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezwawyhyjurlfihcowtadxmhwevcccrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247456.5107334-545-3152177949803/AnsiballZ_lineinfile.py'
Feb 16 13:10:56 compute-1 sudo[166160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:56 compute-1 python3.9[166162]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:56 compute-1 sudo[166160]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:57 compute-1 sudo[166312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfrxauzsoajhfyihhczvmxmrolucajgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247457.1557598-545-60723471320734/AnsiballZ_lineinfile.py'
Feb 16 13:10:57 compute-1 sudo[166312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:57 compute-1 python3.9[166314]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:57 compute-1 sudo[166312]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:58 compute-1 sudo[166464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scldahbwtsarfrkmpykffpxsofubjzue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247457.8256912-545-17872670665126/AnsiballZ_lineinfile.py'
Feb 16 13:10:58 compute-1 sudo[166464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:58 compute-1 python3.9[166466]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:58 compute-1 sudo[166464]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:58 compute-1 sudo[166616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwcoabglsthawmlcwbrqtuizbixobaej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247458.466888-545-141991469798915/AnsiballZ_lineinfile.py'
Feb 16 13:10:58 compute-1 sudo[166616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:58 compute-1 python3.9[166618]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:10:58 compute-1 sudo[166616]: pam_unix(sudo:session): session closed for user root
Feb 16 13:10:59 compute-1 sudo[166768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kphvhimfmxppwejbralhkedqtcdwgznr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247459.1431441-603-242005518500275/AnsiballZ_stat.py'
Feb 16 13:10:59 compute-1 sudo[166768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:10:59 compute-1 python3.9[166770]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:10:59 compute-1 sudo[166768]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:00 compute-1 sudo[166922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqgbsnvelqzdscqjgasftwanhgzekzzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247459.829208-619-200186599788065/AnsiballZ_command.py'
Feb 16 13:11:00 compute-1 sudo[166922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:00 compute-1 python3.9[166924]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:00 compute-1 sudo[166922]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:00 compute-1 sudo[167075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqkyaxfksrxyplzabdutmonvfnvawysy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247460.5744758-637-89036104364358/AnsiballZ_systemd_service.py'
Feb 16 13:11:00 compute-1 sudo[167075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:01 compute-1 python3.9[167077]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:01 compute-1 systemd[1]: Listening on multipathd control socket.
Feb 16 13:11:01 compute-1 sudo[167075]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:01 compute-1 sudo[167231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmjubnmqmwsbeqrmyunlpudwjingcqfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247461.3917778-653-120578804075880/AnsiballZ_systemd_service.py'
Feb 16 13:11:01 compute-1 sudo[167231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:01 compute-1 python3.9[167233]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:02 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 16 13:11:02 compute-1 udevadm[167238]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 16 13:11:02 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 16 13:11:02 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 13:11:02 compute-1 multipathd[167242]: --------start up--------
Feb 16 13:11:02 compute-1 multipathd[167242]: read /etc/multipath.conf
Feb 16 13:11:02 compute-1 multipathd[167242]: path checkers start up
Feb 16 13:11:02 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 13:11:02 compute-1 sudo[167231]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:02 compute-1 sudo[167399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmxjnhekavcsvsgxbuyfeocrwerhoobl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247462.660043-678-82310910871940/AnsiballZ_file.py'
Feb 16 13:11:02 compute-1 sudo[167399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:03 compute-1 python3.9[167401]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 13:11:03 compute-1 sudo[167399]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:11:03.312 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:11:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:11:03.314 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:11:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:11:03.314 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:11:03 compute-1 sudo[167551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmarteektwbgezeyqjgnvoqtnhbsqwzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247463.4053123-693-26381782720889/AnsiballZ_modprobe.py'
Feb 16 13:11:03 compute-1 sudo[167551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:03 compute-1 python3.9[167553]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 16 13:11:03 compute-1 kernel: Key type psk registered
Feb 16 13:11:03 compute-1 sudo[167551]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:04 compute-1 sudo[167715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnabfoforkcwgvzniygsqpvgodokknbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247464.1027226-709-35450011092358/AnsiballZ_stat.py'
Feb 16 13:11:04 compute-1 sudo[167715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:04 compute-1 python3.9[167717]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:11:04 compute-1 sudo[167715]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:05 compute-1 sudo[167838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moxxwdfwzyvxeswegyxvgcgmtrbjwwnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247464.1027226-709-35450011092358/AnsiballZ_copy.py'
Feb 16 13:11:05 compute-1 sudo[167838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:05 compute-1 python3.9[167840]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247464.1027226-709-35450011092358/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:05 compute-1 sudo[167838]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:05 compute-1 sudo[167990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmsukfcxpsdcozysnudrlzyrpvhuciyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247465.5909722-741-930832095267/AnsiballZ_lineinfile.py'
Feb 16 13:11:05 compute-1 sudo[167990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:06 compute-1 python3.9[167992]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:06 compute-1 sudo[167990]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:06 compute-1 sudo[168151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvobxcqyytmfjliacwfljuwjivlbpffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247466.2949975-757-115181243901104/AnsiballZ_systemd.py'
Feb 16 13:11:06 compute-1 sudo[168151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:06 compute-1 podman[168116]: 2026-02-16 13:11:06.934236456 +0000 UTC m=+0.078868468 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 16 13:11:07 compute-1 python3.9[168160]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:07 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 13:11:07 compute-1 systemd[1]: Stopped Load Kernel Modules.
Feb 16 13:11:07 compute-1 systemd[1]: Stopping Load Kernel Modules...
Feb 16 13:11:07 compute-1 systemd[1]: Starting Load Kernel Modules...
Feb 16 13:11:07 compute-1 systemd[1]: Finished Load Kernel Modules.
Feb 16 13:11:07 compute-1 sudo[168151]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:07 compute-1 sudo[168318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojabjgccdnublziaouerefybpybwjjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247467.6802602-773-33714172586688/AnsiballZ_dnf.py'
Feb 16 13:11:07 compute-1 sudo[168318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:08 compute-1 python3.9[168320]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 13:11:09 compute-1 podman[168325]: 2026-02-16 13:11:09.9342013 +0000 UTC m=+0.075171802 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:11:10 compute-1 systemd[1]: Reloading.
Feb 16 13:11:10 compute-1 systemd-rc-local-generator[168374]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:10 compute-1 systemd-sysv-generator[168378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:10 compute-1 systemd[1]: Reloading.
Feb 16 13:11:10 compute-1 systemd-rc-local-generator[168422]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:10 compute-1 systemd-sysv-generator[168425]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:10 compute-1 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 13:11:10 compute-1 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 13:11:11 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 13:11:11 compute-1 systemd[1]: Starting man-db-cache-update.service...
Feb 16 13:11:11 compute-1 systemd[1]: Reloading.
Feb 16 13:11:11 compute-1 systemd-sysv-generator[168522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:11 compute-1 systemd-rc-local-generator[168516]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:11 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 13:11:11 compute-1 sudo[168318]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:12 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 13:11:12 compute-1 systemd[1]: Finished man-db-cache-update.service.
Feb 16 13:11:12 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.224s CPU time.
Feb 16 13:11:12 compute-1 systemd[1]: run-r41a7e589014c40278e2e6a25ac16002c.service: Deactivated successfully.
Feb 16 13:11:12 compute-1 sudo[169834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awgbcpqkibhjifffywytzniisydxwtnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247472.2858565-789-220822205025809/AnsiballZ_systemd_service.py'
Feb 16 13:11:12 compute-1 sudo[169834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:12 compute-1 python3.9[169836]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:12 compute-1 systemd[1]: Stopping Open-iSCSI...
Feb 16 13:11:12 compute-1 iscsid[163286]: iscsid shutting down.
Feb 16 13:11:12 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Feb 16 13:11:12 compute-1 systemd[1]: Stopped Open-iSCSI.
Feb 16 13:11:12 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 13:11:12 compute-1 systemd[1]: Starting Open-iSCSI...
Feb 16 13:11:12 compute-1 systemd[1]: Started Open-iSCSI.
Feb 16 13:11:12 compute-1 sudo[169834]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:13 compute-1 sudo[169990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwfkcqwfsfctagmvktrurihzvnqgqldv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247473.1633978-805-249307863192732/AnsiballZ_systemd_service.py'
Feb 16 13:11:13 compute-1 sudo[169990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:13 compute-1 python3.9[169992]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:11:13 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 16 13:11:13 compute-1 multipathd[167242]: exit (signal)
Feb 16 13:11:13 compute-1 multipathd[167242]: --------shut down-------
Feb 16 13:11:13 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Feb 16 13:11:13 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 16 13:11:13 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 13:11:13 compute-1 multipathd[169998]: --------start up--------
Feb 16 13:11:13 compute-1 multipathd[169998]: read /etc/multipath.conf
Feb 16 13:11:13 compute-1 multipathd[169998]: path checkers start up
Feb 16 13:11:13 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 13:11:13 compute-1 sudo[169990]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:14 compute-1 python3.9[170155]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:11:15 compute-1 sudo[170309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uegyvphhtoofjqjsreaxmghfaeblauvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247475.4475298-840-276342645133603/AnsiballZ_file.py'
Feb 16 13:11:15 compute-1 sudo[170309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:15 compute-1 python3.9[170311]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:15 compute-1 sudo[170309]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:16 compute-1 sudo[170461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efrnwiooeqxnuddduwpcyjusnqaodjzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247476.4927304-862-163911157831245/AnsiballZ_systemd_service.py'
Feb 16 13:11:16 compute-1 sudo[170461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:17 compute-1 python3.9[170463]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:11:17 compute-1 systemd[1]: Reloading.
Feb 16 13:11:17 compute-1 systemd-rc-local-generator[170487]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:17 compute-1 systemd-sysv-generator[170490]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:17 compute-1 sudo[170461]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:18 compute-1 python3.9[170654]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:11:18 compute-1 network[170671]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:11:18 compute-1 network[170672]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:11:18 compute-1 network[170673]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:11:22 compute-1 sudo[170944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwxgysufqvvxzjhqgvblihhisnxdbfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247481.9986725-900-182215906244653/AnsiballZ_systemd_service.py'
Feb 16 13:11:22 compute-1 sudo[170944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:22 compute-1 python3.9[170946]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:22 compute-1 sudo[170944]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:23 compute-1 sudo[171097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldardinjehmamcoscsolddgegvvsliqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247482.9924326-900-158871436684362/AnsiballZ_systemd_service.py'
Feb 16 13:11:23 compute-1 sudo[171097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:24 compute-1 python3.9[171099]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:24 compute-1 sudo[171097]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:24 compute-1 sudo[171250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koogzlkpcjehbmxhqzhysjhwwfvjzrel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247484.373586-900-36624516828950/AnsiballZ_systemd_service.py'
Feb 16 13:11:24 compute-1 sudo[171250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:25 compute-1 python3.9[171252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:25 compute-1 sudo[171250]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:25 compute-1 sudo[171403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcodsqqyqhmfaivqycbownfqndrjxgyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247485.2798896-900-59062725383583/AnsiballZ_systemd_service.py'
Feb 16 13:11:25 compute-1 sudo[171403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:25 compute-1 python3.9[171405]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:25 compute-1 sudo[171403]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:26 compute-1 sudo[171556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdfzjsmrrcmpytmddpzqfegkxfgvbauo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247486.0616012-900-57841410649163/AnsiballZ_systemd_service.py'
Feb 16 13:11:26 compute-1 sudo[171556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:26 compute-1 python3.9[171558]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:26 compute-1 sudo[171556]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:27 compute-1 sudo[171709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxprphonosuwnccfndnzcykzfzhlhznl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247486.81951-900-240949695945916/AnsiballZ_systemd_service.py'
Feb 16 13:11:27 compute-1 sudo[171709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:27 compute-1 python3.9[171711]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:27 compute-1 sudo[171709]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:27 compute-1 sudo[171862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jahiuxuqdkauzvhaprubboypdxrwlbrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247487.5910792-900-212396906543934/AnsiballZ_systemd_service.py'
Feb 16 13:11:27 compute-1 sudo[171862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:28 compute-1 python3.9[171864]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:29 compute-1 sudo[171862]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:29 compute-1 sudo[172015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mresjsztvxbnphzmoiiwvtvdgvdbjxlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247489.3980584-900-249815014668842/AnsiballZ_systemd_service.py'
Feb 16 13:11:29 compute-1 sudo[172015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:29 compute-1 python3.9[172017]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:11:29 compute-1 sudo[172015]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:30 compute-1 sudo[172168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzkcjyvaheljmesqgvsdnjagklsoqibg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247490.5094302-1018-62396358268517/AnsiballZ_file.py'
Feb 16 13:11:30 compute-1 sudo[172168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:30 compute-1 python3.9[172170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:31 compute-1 sudo[172168]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:31 compute-1 sudo[172320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iasyvimafyvjgbfylxofnhioekhpogof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247491.1581452-1018-164394900412988/AnsiballZ_file.py'
Feb 16 13:11:31 compute-1 sudo[172320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:31 compute-1 python3.9[172322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:31 compute-1 sudo[172320]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:32 compute-1 sudo[172472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umrlozrufiftxunicvktdelljywteasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247491.791249-1018-29516639781836/AnsiballZ_file.py'
Feb 16 13:11:32 compute-1 sudo[172472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:32 compute-1 python3.9[172474]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:32 compute-1 sudo[172472]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:32 compute-1 sudo[172624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glfwshvabiucnchhsemmxqddsuwhjkjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247492.6837707-1018-250552022417738/AnsiballZ_file.py'
Feb 16 13:11:32 compute-1 sudo[172624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:33 compute-1 python3.9[172626]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:33 compute-1 sudo[172624]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:33 compute-1 sudo[172776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jquiplvjdbczwjklvszsibpwnoydnfdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247493.3373332-1018-20556979572247/AnsiballZ_file.py'
Feb 16 13:11:33 compute-1 sudo[172776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:33 compute-1 python3.9[172778]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:33 compute-1 sudo[172776]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:34 compute-1 sudo[172930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcsgaxvdbinbqcdmsdzoexcixnaabrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247493.9984286-1018-203573218693901/AnsiballZ_file.py'
Feb 16 13:11:34 compute-1 sudo[172930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:34 compute-1 python3.9[172932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:34 compute-1 sudo[172930]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:34 compute-1 sshd-session[172878]: Connection closed by authenticating user root 146.190.226.24 port 34396 [preauth]
Feb 16 13:11:34 compute-1 sudo[173082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xroinmupaiytuxczwuwinhodnahwmitb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247494.570795-1018-277678056295796/AnsiballZ_file.py'
Feb 16 13:11:34 compute-1 sudo[173082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:35 compute-1 python3.9[173084]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:35 compute-1 sudo[173082]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:35 compute-1 sudo[173234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-welfwpxcqljzojutflcmgdnlswkhuxyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247495.206228-1018-164863470734862/AnsiballZ_file.py'
Feb 16 13:11:35 compute-1 sudo[173234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:35 compute-1 python3.9[173236]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:35 compute-1 sudo[173234]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:36 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 16 13:11:36 compute-1 sudo[173387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzwtsrmkkoccskwuyrqdlgsyojvbwlkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247496.0341773-1132-230292437273604/AnsiballZ_file.py'
Feb 16 13:11:36 compute-1 sudo[173387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:36 compute-1 python3.9[173389]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:36 compute-1 sudo[173387]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:37 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:11:37 compute-1 podman[173490]: 2026-02-16 13:11:37.467235869 +0000 UTC m=+0.080132573 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:11:37 compute-1 sudo[173560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjmhkvdamileioocblutoiugcjvbhuzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247497.0709205-1132-32173396458904/AnsiballZ_file.py'
Feb 16 13:11:37 compute-1 sudo[173560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:37 compute-1 python3.9[173562]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:37 compute-1 sudo[173560]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:38 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 16 13:11:38 compute-1 sudo[173713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faijblnyydbdkqmabyeusgfkxztsaakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247498.1407175-1132-114381585155181/AnsiballZ_file.py'
Feb 16 13:11:38 compute-1 sudo[173713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:38 compute-1 python3.9[173715]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:38 compute-1 sudo[173713]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:39 compute-1 sudo[173865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtcsjrluorqlhzlzuktshoudjfzenjvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247498.8354394-1132-59511119010730/AnsiballZ_file.py'
Feb 16 13:11:39 compute-1 sudo[173865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:39 compute-1 python3.9[173867]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:39 compute-1 sudo[173865]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:39 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 16 13:11:39 compute-1 sudo[174018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxpfwwemznxupqmsdxpayqpemcjbcssa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247499.4921134-1132-232556590915916/AnsiballZ_file.py'
Feb 16 13:11:39 compute-1 sudo[174018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:39 compute-1 python3.9[174020]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:39 compute-1 sudo[174018]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:40 compute-1 sudo[174183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmkexingscrckpwhfvvbnbgupjjoafvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247500.275756-1132-115400750469545/AnsiballZ_file.py'
Feb 16 13:11:40 compute-1 sudo[174183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:40 compute-1 podman[174144]: 2026-02-16 13:11:40.649284186 +0000 UTC m=+0.098945084 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:11:40 compute-1 python3.9[174192]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:40 compute-1 sudo[174183]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:41 compute-1 sudo[174349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdirpiyraeuhfkwbntztutivzcvimwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247500.9431946-1132-76840111642042/AnsiballZ_file.py'
Feb 16 13:11:41 compute-1 sudo[174349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:41 compute-1 python3.9[174351]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:41 compute-1 sudo[174349]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:42 compute-1 sudo[174501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzsptukavkstpzcazohxaffccwivefwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247501.6773167-1132-247505646979411/AnsiballZ_file.py'
Feb 16 13:11:42 compute-1 sudo[174501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:42 compute-1 python3.9[174503]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:11:42 compute-1 sudo[174501]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:42 compute-1 sudo[174653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jerbohrnzqirmmzzgdowhkczscfacdal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247502.5663254-1248-231411035180892/AnsiballZ_command.py'
Feb 16 13:11:42 compute-1 sudo[174653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:43 compute-1 python3.9[174655]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:43 compute-1 sudo[174653]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:44 compute-1 python3.9[174807]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:11:45 compute-1 sudo[174957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uogzagzjwyhdsudlvwelwyefpugzchlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247504.9153972-1284-141012670159304/AnsiballZ_systemd_service.py'
Feb 16 13:11:45 compute-1 sudo[174957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:45 compute-1 python3.9[174959]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:11:45 compute-1 systemd[1]: Reloading.
Feb 16 13:11:45 compute-1 systemd-sysv-generator[174988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:11:45 compute-1 systemd-rc-local-generator[174983]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:11:45 compute-1 sudo[174957]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:46 compute-1 sudo[175151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mogdipjmgcicwozdvjmdxdlwxhgovbjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247506.0967097-1301-144925347538971/AnsiballZ_command.py'
Feb 16 13:11:46 compute-1 sudo[175151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:46 compute-1 python3.9[175153]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:46 compute-1 sudo[175151]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:46 compute-1 sudo[175304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ierjwtfwuojbhdsqeiszxtpmpjlhqkdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247506.7434378-1301-51354196786768/AnsiballZ_command.py'
Feb 16 13:11:46 compute-1 sudo[175304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:47 compute-1 python3.9[175306]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:47 compute-1 sudo[175304]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:47 compute-1 sudo[175457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoolhsfigftqnwvrpimzwbmwhtfguskw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247507.3401783-1301-237660504136178/AnsiballZ_command.py'
Feb 16 13:11:47 compute-1 sudo[175457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:47 compute-1 python3.9[175459]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:47 compute-1 sudo[175457]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:48 compute-1 sudo[175610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombuayzzvumkmjkltknajydvzmdznvwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247508.0074542-1301-164001518737160/AnsiballZ_command.py'
Feb 16 13:11:48 compute-1 sudo[175610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:48 compute-1 python3.9[175612]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:48 compute-1 sudo[175610]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:49 compute-1 sudo[175763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glfxxawmwgbnggpvexhutjhnlssgtyyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247509.0213268-1301-114247146342825/AnsiballZ_command.py'
Feb 16 13:11:49 compute-1 sudo[175763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:49 compute-1 python3.9[175765]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:49 compute-1 sudo[175763]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:50 compute-1 sudo[175916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwnxihhzqyjfqekekymsjiutfosmvoro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247509.709211-1301-239182992816624/AnsiballZ_command.py'
Feb 16 13:11:50 compute-1 sudo[175916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:50 compute-1 python3.9[175918]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:50 compute-1 sudo[175916]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:50 compute-1 sudo[176069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrhddebyxttmnlczhpnrlrmahbpfqzrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247510.3391933-1301-57809901248965/AnsiballZ_command.py'
Feb 16 13:11:50 compute-1 sudo[176069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:50 compute-1 python3.9[176071]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:50 compute-1 sudo[176069]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:51 compute-1 sudo[176222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igubfdlqoynkmbpebohwzvqigxymvfms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247510.9797769-1301-99696026553012/AnsiballZ_command.py'
Feb 16 13:11:51 compute-1 sudo[176222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:51 compute-1 python3.9[176224]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:11:51 compute-1 sudo[176222]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:52 compute-1 sudo[176375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgglzewxpitmsjgywreyufadsqyjteup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247512.7590117-1443-67508037585019/AnsiballZ_file.py'
Feb 16 13:11:52 compute-1 sudo[176375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:53 compute-1 python3.9[176377]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:53 compute-1 sudo[176375]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:53 compute-1 sudo[176527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhvkllylvhwqfqwtowyzzsdamonjvcpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247513.3648093-1443-220931363679760/AnsiballZ_file.py'
Feb 16 13:11:53 compute-1 sudo[176527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:53 compute-1 python3.9[176529]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:53 compute-1 sudo[176527]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:54 compute-1 sudo[176679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypksifdfbapgyjhvnitofqvntzvextph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247514.2048721-1473-116179949845916/AnsiballZ_file.py'
Feb 16 13:11:54 compute-1 sudo[176679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:54 compute-1 python3.9[176681]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:54 compute-1 sudo[176679]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:55 compute-1 sudo[176831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwtrltmpwgbmcgmtgtesgvixyziyxhpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247515.1383922-1473-1923801734334/AnsiballZ_file.py'
Feb 16 13:11:55 compute-1 sudo[176831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:55 compute-1 python3.9[176833]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:55 compute-1 sudo[176831]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:56 compute-1 sudo[176983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljvilayuzmuxnobtmfbmsbwharrlasql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247515.7516348-1473-195107826878936/AnsiballZ_file.py'
Feb 16 13:11:56 compute-1 sudo[176983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:56 compute-1 python3.9[176985]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:56 compute-1 sudo[176983]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:56 compute-1 sudo[177135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhuxcgmovkglwxekluzqucjxsasbkkvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247516.3977182-1473-96457962052184/AnsiballZ_file.py'
Feb 16 13:11:56 compute-1 sudo[177135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:56 compute-1 python3.9[177137]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:56 compute-1 sudo[177135]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:57 compute-1 sudo[177287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piwbrmkzecfinljtrwrrecdxechlgvyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247516.963805-1473-165704914855048/AnsiballZ_file.py'
Feb 16 13:11:57 compute-1 sudo[177287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:57 compute-1 python3.9[177289]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:57 compute-1 sudo[177287]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:57 compute-1 sudo[177439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maymdvtkjcwzkpwdevlxwofnzexykegt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247517.5648148-1473-216285749391415/AnsiballZ_file.py'
Feb 16 13:11:57 compute-1 sudo[177439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:58 compute-1 python3.9[177441]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:58 compute-1 sudo[177439]: pam_unix(sudo:session): session closed for user root
Feb 16 13:11:58 compute-1 sudo[177591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmqcgnjobyrmdslfnisuumdmjfzvkhdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247518.2465684-1473-4435970746221/AnsiballZ_file.py'
Feb 16 13:11:58 compute-1 sudo[177591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:11:58 compute-1 python3.9[177593]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:11:58 compute-1 sudo[177591]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:12:03.314 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:12:03.315 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:12:03.315 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:05 compute-1 sudo[177743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlswnvrqhduzfinseqlnwulqqpwjrisy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247524.8188457-1710-207994864293384/AnsiballZ_getent.py'
Feb 16 13:12:05 compute-1 sudo[177743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:05 compute-1 python3.9[177745]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 16 13:12:05 compute-1 sudo[177743]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:06 compute-1 sudo[177896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertojxaiansgfpirlpfzwsffaxjgbtnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247525.6423252-1726-267071280775151/AnsiballZ_group.py'
Feb 16 13:12:06 compute-1 sudo[177896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:06 compute-1 python3.9[177898]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:12:06 compute-1 groupadd[177899]: group added to /etc/group: name=nova, GID=42436
Feb 16 13:12:06 compute-1 groupadd[177899]: group added to /etc/gshadow: name=nova
Feb 16 13:12:06 compute-1 groupadd[177899]: new group: name=nova, GID=42436
Feb 16 13:12:06 compute-1 sudo[177896]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:07 compute-1 sudo[178054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eklzvfavyltgnocgnevimupcdwdshpem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247526.5704331-1742-199561634478155/AnsiballZ_user.py'
Feb 16 13:12:07 compute-1 sudo[178054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:07 compute-1 python3.9[178056]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:12:07 compute-1 useradd[178058]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Feb 16 13:12:07 compute-1 useradd[178058]: add 'nova' to group 'libvirt'
Feb 16 13:12:07 compute-1 useradd[178058]: add 'nova' to shadow group 'libvirt'
Feb 16 13:12:07 compute-1 sudo[178054]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:07 compute-1 podman[178065]: 2026-02-16 13:12:07.623087216 +0000 UTC m=+0.096965296 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:12:08 compute-1 sshd-session[178107]: Accepted publickey for zuul from 192.168.122.30 port 56330 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:12:08 compute-1 systemd-logind[821]: New session 26 of user zuul.
Feb 16 13:12:08 compute-1 systemd[1]: Started Session 26 of User zuul.
Feb 16 13:12:08 compute-1 sshd-session[178107]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:12:09 compute-1 sshd-session[178110]: Received disconnect from 192.168.122.30 port 56330:11: disconnected by user
Feb 16 13:12:09 compute-1 sshd-session[178110]: Disconnected from user zuul 192.168.122.30 port 56330
Feb 16 13:12:09 compute-1 sshd-session[178107]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:12:09 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Feb 16 13:12:09 compute-1 systemd-logind[821]: Session 26 logged out. Waiting for processes to exit.
Feb 16 13:12:09 compute-1 systemd-logind[821]: Removed session 26.
Feb 16 13:12:09 compute-1 python3.9[178260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:10 compute-1 python3.9[178336]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:10 compute-1 python3.9[178486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:10 compute-1 podman[178581]: 2026-02-16 13:12:10.928983393 +0000 UTC m=+0.080835760 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 16 13:12:11 compute-1 python3.9[178620]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247530.1677315-1792-154318099849311/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:11 compute-1 python3.9[178783]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:12 compute-1 python3.9[178904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247531.2111683-1792-29307469623833/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:12 compute-1 python3.9[179054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:13 compute-1 python3.9[179175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247532.4025242-1792-219826678838319/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:13 compute-1 python3.9[179325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:14 compute-1 python3.9[179446]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247533.5250201-1900-39869364551452/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:14 compute-1 sudo[179596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwejmertnrrhcurcxkyfdfkucwihoenf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247534.5934026-1930-72603692536533/AnsiballZ_file.py'
Feb 16 13:12:14 compute-1 sudo[179596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:15 compute-1 python3.9[179598]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:15 compute-1 sudo[179596]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:15 compute-1 sudo[179748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zojishzujjlkqqkxynkgewgvfmycfbky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247535.2101374-1946-196804162966292/AnsiballZ_copy.py'
Feb 16 13:12:15 compute-1 sudo[179748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:15 compute-1 python3.9[179750]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:15 compute-1 sudo[179748]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:16 compute-1 sudo[179900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvlqhbhycimtvlxasftxwkfosovipcfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.0592654-1962-215252901861380/AnsiballZ_stat.py'
Feb 16 13:12:16 compute-1 sudo[179900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:16 compute-1 python3.9[179902]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:16 compute-1 sudo[179900]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:17 compute-1 sudo[180052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-majlvmwzhibipjncejbjoiazkbgvykdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.8151424-1978-33092320180843/AnsiballZ_stat.py'
Feb 16 13:12:17 compute-1 sudo[180052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:17 compute-1 python3.9[180054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:17 compute-1 sudo[180052]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:17 compute-1 sudo[180175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqowdvoduqpxqrzqztyuzeojfdehutd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247536.8151424-1978-33092320180843/AnsiballZ_copy.py'
Feb 16 13:12:17 compute-1 sudo[180175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:17 compute-1 python3.9[180177]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771247536.8151424-1978-33092320180843/.source _original_basename=.oek0p6jr follow=False checksum=884f4c95e10e63dbe105731edd877113c02581a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 16 13:12:17 compute-1 sudo[180175]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:18 compute-1 python3.9[180329]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:19 compute-1 sudo[180481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqakhjynfkckxhhanjbqzeiydqzehtrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247539.2334125-2035-188352897470487/AnsiballZ_file.py'
Feb 16 13:12:19 compute-1 sudo[180481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:19 compute-1 python3.9[180483]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:19 compute-1 sudo[180481]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:20 compute-1 sudo[180633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkusmpvfvbbznvkirgjtzyrjlsxvuyjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247539.9570014-2050-46378792674063/AnsiballZ_file.py'
Feb 16 13:12:20 compute-1 sudo[180633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:20 compute-1 python3.9[180635]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:20 compute-1 sudo[180633]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:21 compute-1 python3.9[180785]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:23 compute-1 sudo[181206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhzhfqbxtnonrnhwwoyjbjmiekwwjygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247543.36478-2118-189919006230490/AnsiballZ_container_config_data.py'
Feb 16 13:12:23 compute-1 sudo[181206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:23 compute-1 python3.9[181208]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 16 13:12:23 compute-1 sudo[181206]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:24 compute-1 sudo[181358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqnsxpkhpfdezfncljhwcswrrbcyjxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247544.44298-2140-95264962770784/AnsiballZ_container_config_hash.py'
Feb 16 13:12:24 compute-1 sudo[181358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:25 compute-1 python3.9[181360]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:12:25 compute-1 sudo[181358]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:25 compute-1 sudo[181510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbavgvuhpjiezhlvaebrwbuzilvdvjwb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247545.4111655-2160-95509763144146/AnsiballZ_edpm_container_manage.py'
Feb 16 13:12:25 compute-1 sudo[181510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:26 compute-1 python3[181512]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:12:26 compute-1 podman[181548]: 2026-02-16 13:12:26.25311769 +0000 UTC m=+0.044714193 container create 0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, container_name=nova_compute_init)
Feb 16 13:12:26 compute-1 podman[181548]: 2026-02-16 13:12:26.22850475 +0000 UTC m=+0.020101283 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 13:12:26 compute-1 python3[181512]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 16 13:12:26 compute-1 sudo[181510]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:27 compute-1 sudo[181736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtksibraztlljdlixwwmyftwzbfhwjqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247546.6772819-2176-262087678656254/AnsiballZ_stat.py'
Feb 16 13:12:27 compute-1 sudo[181736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:27 compute-1 python3.9[181738]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:27 compute-1 sudo[181736]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:28 compute-1 python3.9[181890]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:12:29 compute-1 sudo[182040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pagtvtzwrlmjkglqsxtvwlmcvxsbadhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247549.1553864-2231-116711631300441/AnsiballZ_stat.py'
Feb 16 13:12:29 compute-1 sudo[182040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:29 compute-1 python3.9[182042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:29 compute-1 sudo[182040]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:29 compute-1 sudo[182165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukgfqgwnlowrkrtzoopmxrldzuyblqbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247549.1553864-2231-116711631300441/AnsiballZ_copy.py'
Feb 16 13:12:29 compute-1 sudo[182165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:30 compute-1 python3.9[182167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247549.1553864-2231-116711631300441/.source.yaml _original_basename=.in6uxcp_ follow=False checksum=35696d2121a915adbae8ecc15e69892c5fbd315a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:30 compute-1 sudo[182165]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:30 compute-1 sudo[182317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlncydtxjmozfdlfbsjxnkopmubzymce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247550.5324495-2264-166766587743782/AnsiballZ_file.py'
Feb 16 13:12:30 compute-1 sudo[182317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:30 compute-1 python3.9[182319]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:30 compute-1 sudo[182317]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:31 compute-1 sudo[182469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjkqvlcdeynxzgxjggflxdwfxxomoqhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.3110216-2281-198995089062201/AnsiballZ_file.py'
Feb 16 13:12:31 compute-1 sudo[182469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:31 compute-1 python3.9[182471]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:12:31 compute-1 sudo[182469]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:32 compute-1 sudo[182621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhvjfvlwfrczhojsksbjsofpjyhkfzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.9401994-2296-15182672007200/AnsiballZ_stat.py'
Feb 16 13:12:32 compute-1 sudo[182621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:32 compute-1 python3.9[182623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:32 compute-1 sudo[182621]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:32 compute-1 sudo[182744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjzmqvagasliyknrvtyasqhkgsuasur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247551.9401994-2296-15182672007200/AnsiballZ_copy.py'
Feb 16 13:12:32 compute-1 sudo[182744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:32 compute-1 python3.9[182746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247551.9401994-2296-15182672007200/.source.json _original_basename=.116g1ovo follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:32 compute-1 sudo[182744]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:33 compute-1 python3.9[182896]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:35 compute-1 sudo[183317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqzcayugiqmtmdiwftzyegxeedgddpwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247555.470707-2376-194960729165793/AnsiballZ_container_config_data.py'
Feb 16 13:12:35 compute-1 sudo[183317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:35 compute-1 python3.9[183319]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 16 13:12:35 compute-1 sudo[183317]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:36 compute-1 sudo[183469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyworleacumxoangbqdbwjqkmgvnjnom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247556.3569133-2398-257524848132116/AnsiballZ_container_config_hash.py'
Feb 16 13:12:36 compute-1 sudo[183469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:36 compute-1 python3.9[183471]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:12:36 compute-1 sudo[183469]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:37 compute-1 sudo[183621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frwjqsxbjlsabdmvbgtkpyqazedrfdgl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247557.1962144-2418-97880194327546/AnsiballZ_edpm_container_manage.py'
Feb 16 13:12:37 compute-1 sudo[183621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:37 compute-1 python3[183623]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:12:37 compute-1 podman[183661]: 2026-02-16 13:12:37.828638172 +0000 UTC m=+0.046807782 container create 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, container_name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS)
Feb 16 13:12:37 compute-1 podman[183661]: 2026-02-16 13:12:37.803243107 +0000 UTC m=+0.021412727 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 13:12:37 compute-1 python3[183623]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 16 13:12:37 compute-1 podman[183675]: 2026-02-16 13:12:37.923148266 +0000 UTC m=+0.060032273 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:12:37 compute-1 sudo[183621]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:38 compute-1 sudo[183866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eukddhgeeitgdtzbnokjyviwuzyqtxsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.198955-2434-134816902578992/AnsiballZ_stat.py'
Feb 16 13:12:38 compute-1 sudo[183866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:38 compute-1 python3.9[183868]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:38 compute-1 sudo[183866]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:39 compute-1 sudo[184020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfcpgzbzvwkscxjsgrtymdeiepqhmdnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.9824889-2452-195444846487686/AnsiballZ_file.py'
Feb 16 13:12:39 compute-1 sudo[184020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:39 compute-1 python3.9[184022]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:39 compute-1 sudo[184020]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:39 compute-1 sudo[184096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odqixonswqgulujeejqzwpmlwyzvsvik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247558.9824889-2452-195444846487686/AnsiballZ_stat.py'
Feb 16 13:12:39 compute-1 sudo[184096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:40 compute-1 python3.9[184098]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:40 compute-1 sudo[184096]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:40 compute-1 sudo[184247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dftkfttuqtfshftnyzsqwdtkxgylsrkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1655407-2452-41091595288/AnsiballZ_copy.py'
Feb 16 13:12:40 compute-1 sudo[184247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:40 compute-1 python3.9[184249]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247560.1655407-2452-41091595288/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:40 compute-1 sudo[184247]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:40 compute-1 sudo[184323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqhdxbznjnmuzyurxpjejsiupivgubab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1655407-2452-41091595288/AnsiballZ_systemd.py'
Feb 16 13:12:40 compute-1 sudo[184323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:41 compute-1 python3.9[184325]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:12:41 compute-1 systemd[1]: Reloading.
Feb 16 13:12:41 compute-1 systemd-rc-local-generator[184371]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:12:41 compute-1 systemd-sysv-generator[184376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:12:41 compute-1 podman[184329]: 2026-02-16 13:12:41.374117128 +0000 UTC m=+0.106925546 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:12:41 compute-1 sudo[184323]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:41 compute-1 sshd-session[184326]: Invalid user sol from 2.57.122.210 port 59136
Feb 16 13:12:41 compute-1 sshd-session[184326]: Connection closed by invalid user sol 2.57.122.210 port 59136 [preauth]
Feb 16 13:12:41 compute-1 sudo[184471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skihfmtjliyazrhlvigsylmxmohlyhhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247560.1655407-2452-41091595288/AnsiballZ_systemd.py'
Feb 16 13:12:41 compute-1 sudo[184471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:42 compute-1 python3.9[184473]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:12:42 compute-1 systemd[1]: Reloading.
Feb 16 13:12:42 compute-1 sshd-session[184335]: Connection closed by authenticating user root 146.190.226.24 port 36844 [preauth]
Feb 16 13:12:42 compute-1 systemd-sysv-generator[184502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:12:42 compute-1 systemd-rc-local-generator[184499]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:12:42 compute-1 systemd[1]: Starting nova_compute container...
Feb 16 13:12:42 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:42 compute-1 podman[184520]: 2026-02-16 13:12:42.522953779 +0000 UTC m=+0.131483563 container init 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:12:42 compute-1 podman[184520]: 2026-02-16 13:12:42.528502695 +0000 UTC m=+0.137032429 container start 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:12:42 compute-1 podman[184520]: nova_compute
Feb 16 13:12:42 compute-1 nova_compute[184536]: + sudo -E kolla_set_configs
Feb 16 13:12:42 compute-1 systemd[1]: Started nova_compute container.
Feb 16 13:12:42 compute-1 sudo[184471]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Validating config file
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying service configuration files
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Deleting /etc/ceph
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Creating directory /etc/ceph
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Writing out command to execute
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:42 compute-1 nova_compute[184536]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:42 compute-1 nova_compute[184536]: ++ cat /run_command
Feb 16 13:12:42 compute-1 nova_compute[184536]: + CMD=nova-compute
Feb 16 13:12:42 compute-1 nova_compute[184536]: + ARGS=
Feb 16 13:12:42 compute-1 nova_compute[184536]: + sudo kolla_copy_cacerts
Feb 16 13:12:42 compute-1 nova_compute[184536]: + [[ ! -n '' ]]
Feb 16 13:12:42 compute-1 nova_compute[184536]: + . kolla_extend_start
Feb 16 13:12:42 compute-1 nova_compute[184536]: Running command: 'nova-compute'
Feb 16 13:12:42 compute-1 nova_compute[184536]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 13:12:42 compute-1 nova_compute[184536]: + umask 0022
Feb 16 13:12:42 compute-1 nova_compute[184536]: + exec nova-compute
Feb 16 13:12:43 compute-1 python3.9[184697]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:12:44 compute-1 sudo[184848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrbsnynpjqrtikohmdmsqqiocfdouzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247564.03354-2542-268461076170275/AnsiballZ_stat.py'
Feb 16 13:12:44 compute-1 sudo[184848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:44 compute-1 python3.9[184850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.480 184540 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.481 184540 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.481 184540 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.481 184540 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 13:12:44 compute-1 sudo[184848]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.640 184540 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.654 184540 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:12:44 compute-1 nova_compute[184536]: 2026-02-16 13:12:44.654 184540 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 13:12:44 compute-1 sudo[184977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhkrlsvagurcxfztjiyalpzmmtxovrej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247564.03354-2542-268461076170275/AnsiballZ_copy.py'
Feb 16 13:12:44 compute-1 sudo[184977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:45 compute-1 python3.9[184979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247564.03354-2542-268461076170275/.source.yaml _original_basename=.47max9_k follow=False checksum=1f7d38bfcf59309f34da6c109f1ea60c6e218870 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:12:45 compute-1 sudo[184977]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.168 184540 INFO nova.virt.driver [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.283 184540 INFO nova.compute.provider_config [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.301 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.302 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.302 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.302 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.303 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.304 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.305 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.306 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.307 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.307 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.307 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.307 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.307 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.308 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.309 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.309 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.309 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.309 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.309 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.310 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.311 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.311 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.311 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.311 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.311 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.312 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.313 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.314 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.315 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.316 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.317 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.318 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.319 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.320 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.321 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.322 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.322 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.322 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.322 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.322 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.323 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.324 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.324 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.324 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.324 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.324 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.325 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.326 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.327 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.328 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.329 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.329 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.329 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.329 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.329 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.330 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.331 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.332 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.333 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.334 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.335 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.336 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.337 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.338 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.339 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.340 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.341 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.342 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.343 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.344 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.345 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.346 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.347 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.348 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.349 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.350 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.351 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.352 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.353 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.353 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.353 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.353 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.353 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.354 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.354 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.354 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.354 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.354 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.355 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.355 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.355 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.355 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.355 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.356 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.356 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.356 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.356 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.356 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.357 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.358 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.358 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.358 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.358 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.358 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.359 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.359 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.359 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.359 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.360 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.361 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.361 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.361 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.361 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.361 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.362 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.363 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.364 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.364 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.364 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.364 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.364 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.365 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.365 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.365 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.365 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.365 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.366 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.367 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.367 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.367 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.367 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.367 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.368 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.369 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.370 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.370 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.370 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.370 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.370 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.371 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.371 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.371 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.371 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.371 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.372 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.373 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.374 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.374 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.374 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.374 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.374 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.375 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.376 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.376 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.376 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.376 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.376 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.377 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.377 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.377 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.377 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.377 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.378 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.379 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.380 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.380 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.380 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.380 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.380 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.381 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.381 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.382 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.383 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.383 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.384 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.384 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.384 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.385 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.386 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.386 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.386 184540 WARNING oslo_config.cfg [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 13:12:45 compute-1 nova_compute[184536]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 13:12:45 compute-1 nova_compute[184536]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 13:12:45 compute-1 nova_compute[184536]: and ``live_migration_inbound_addr`` respectively.
Feb 16 13:12:45 compute-1 nova_compute[184536]: ).  Its value may be silently ignored in the future.
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.386 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.387 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.387 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.387 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.387 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.387 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.388 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.389 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.389 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.389 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.389 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.389 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.390 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.391 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.391 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.391 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.391 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.391 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.392 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.393 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.394 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.395 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.396 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.397 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.397 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.397 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.397 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.397 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.398 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.399 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.400 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.401 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.402 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.403 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.404 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.405 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.406 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.407 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.408 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.408 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.408 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.408 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.408 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.409 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.410 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.411 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.412 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.413 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.414 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.415 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.416 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.417 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.417 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.417 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.417 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.418 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.419 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.419 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.419 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.419 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.419 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.420 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.421 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.421 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.421 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.421 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.421 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.422 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.423 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.423 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.423 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.423 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.423 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.424 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.424 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.424 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.424 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.424 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.425 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.426 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.426 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.426 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.426 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.426 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.427 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.427 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.427 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.427 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.428 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.428 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.428 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.428 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.429 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.430 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.430 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.430 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.430 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.430 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.431 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.431 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.431 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.431 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.431 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.432 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.432 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.432 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.432 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.432 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.433 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.433 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.433 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.433 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.433 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.434 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.435 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.435 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.435 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.435 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.435 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.436 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.436 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.436 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.436 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.436 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.437 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.438 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.439 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.440 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.441 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.442 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.443 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.444 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.445 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.446 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.447 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.448 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.449 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.450 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.451 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.452 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.453 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.454 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.455 184540 DEBUG oslo_service.service [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.457 184540 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.472 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.473 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.474 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.474 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 13:12:45 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 13:12:45 compute-1 systemd[1]: Started libvirt QEMU daemon.
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.543 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa0594f09a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.546 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa0594f09a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.547 184540 INFO nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Connection event '1' reason 'None'
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.565 184540 WARNING nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Feb 16 13:12:45 compute-1 nova_compute[184536]: 2026-02-16 13:12:45.566 184540 DEBUG nova.virt.libvirt.volume.mount [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 13:12:46 compute-1 python3.9[185181]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.310 184540 INFO nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]: 
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <host>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <uuid>66cb33d6-ab36-4d79-b328-a95e41812fbd</uuid>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <arch>x86_64</arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <microcode version='16777317'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <signature family='23' model='49' stepping='0'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='x2apic'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='tsc-deadline'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='osxsave'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='hypervisor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='tsc_adjust'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='spec-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='stibp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='arch-capabilities'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='cmp_legacy'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='topoext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='virt-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='lbrv'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='tsc-scale'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='vmcb-clean'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='pause-filter'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='pfthreshold'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='svme-addr-chk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='rdctl-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='mds-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature name='pschange-mc-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <pages unit='KiB' size='4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <pages unit='KiB' size='2048'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <pages unit='KiB' size='1048576'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <power_management>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <suspend_mem/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <suspend_disk/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <suspend_hybrid/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </power_management>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <iommu support='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <migration_features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <live/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <uri_transports>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <uri_transport>tcp</uri_transport>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <uri_transport>rdma</uri_transport>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </uri_transports>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </migration_features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <topology>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <cells num='1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <cell id='0'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <memory unit='KiB'>7864284</memory>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <pages unit='KiB' size='4'>1966071</pages>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <distances>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <sibling id='0' value='10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           </distances>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           <cpus num='8'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:           </cpus>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         </cell>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </cells>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </topology>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <cache>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </cache>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <secmodel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model>selinux</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <doi>0</doi>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </secmodel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <secmodel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model>dac</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <doi>0</doi>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </secmodel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </host>
Feb 16 13:12:46 compute-1 nova_compute[184536]: 
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <guest>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <os_type>hvm</os_type>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <arch name='i686'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <wordsize>32</wordsize>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <domain type='qemu'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <domain type='kvm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <pae/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <nonpae/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <apic default='on' toggle='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <cpuselection/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <deviceboot/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <externalSnapshot/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </guest>
Feb 16 13:12:46 compute-1 nova_compute[184536]: 
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <guest>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <os_type>hvm</os_type>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <arch name='x86_64'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <wordsize>64</wordsize>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <domain type='qemu'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <domain type='kvm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <apic default='on' toggle='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <cpuselection/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <deviceboot/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <externalSnapshot/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </guest>
Feb 16 13:12:46 compute-1 nova_compute[184536]: 
Feb 16 13:12:46 compute-1 nova_compute[184536]: </capabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]: 
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.317 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.332 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 13:12:46 compute-1 nova_compute[184536]: <domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <arch>i686</arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <vcpu max='4096'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <os supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <loader supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>rom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pflash</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='readonly'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>yes</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='secure'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </loader>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </os>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>anonymous</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>memfd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </memoryBacking>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <disk supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>disk</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cdrom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>floppy</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>lun</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>fdc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>sata</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </disk>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vnc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </graphics>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <video supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='modelType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vga</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cirrus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>none</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>bochs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ramfb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </video>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='mode'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>subsystem</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>mandatory</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>requisite</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>optional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pci</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hostdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <rng supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>random</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </rng>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='driverType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>path</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>handle</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </filesystem>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emulator</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>external</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>2.0</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </tpm>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </redirdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <channel supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </channel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </crypto>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <interface supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>passt</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </interface>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <panic supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>isa</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>hyperv</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </panic>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <console supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>null</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dev</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pipe</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stdio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>udp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tcp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </console>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <gic supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sev supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='features'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>relaxed</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vapic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vpindex</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>runtime</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>synic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stimer</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reset</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>frequencies</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ipi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>avic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hyperv>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]: </domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.339 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 13:12:46 compute-1 nova_compute[184536]: <domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <arch>i686</arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <vcpu max='240'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <os supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <loader supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>rom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pflash</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='readonly'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>yes</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='secure'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </loader>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </os>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>anonymous</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>memfd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </memoryBacking>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <disk supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>disk</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cdrom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>floppy</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>lun</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ide</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>fdc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>sata</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </disk>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vnc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </graphics>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <video supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='modelType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vga</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cirrus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>none</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>bochs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ramfb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </video>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='mode'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>subsystem</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>mandatory</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>requisite</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>optional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pci</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hostdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <rng supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>random</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </rng>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='driverType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>path</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>handle</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </filesystem>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emulator</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>external</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>2.0</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </tpm>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </redirdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <channel supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </channel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </crypto>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <interface supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>passt</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </interface>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <panic supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>isa</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>hyperv</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </panic>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <console supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>null</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dev</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pipe</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stdio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>udp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tcp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </console>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <gic supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sev supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='features'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>relaxed</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vapic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vpindex</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>runtime</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>synic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stimer</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reset</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>frequencies</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ipi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>avic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hyperv>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]: </domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.388 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.393 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 13:12:46 compute-1 nova_compute[184536]: <domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <arch>x86_64</arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <vcpu max='4096'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <os supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='firmware'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>efi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <loader supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>rom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pflash</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='readonly'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>yes</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='secure'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>yes</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </loader>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </os>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>anonymous</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>memfd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </memoryBacking>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <disk supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>disk</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cdrom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>floppy</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>lun</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>fdc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>sata</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </disk>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vnc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </graphics>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <video supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='modelType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vga</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cirrus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>none</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>bochs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ramfb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </video>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='mode'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>subsystem</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>mandatory</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>requisite</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>optional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pci</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hostdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <rng supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>random</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </rng>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='driverType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>path</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>handle</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </filesystem>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emulator</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>external</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>2.0</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </tpm>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </redirdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <channel supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </channel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </crypto>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <interface supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>passt</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </interface>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <panic supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>isa</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>hyperv</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </panic>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <console supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>null</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dev</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pipe</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stdio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>udp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tcp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </console>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <gic supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sev supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='features'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>relaxed</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vapic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vpindex</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>runtime</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>synic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stimer</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reset</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>frequencies</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ipi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>avic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hyperv>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]: </domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.457 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 13:12:46 compute-1 nova_compute[184536]: <domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <domain>kvm</domain>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <arch>x86_64</arch>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <vcpu max='240'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <iothreads supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <os supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='firmware'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <loader supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>rom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pflash</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='readonly'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>yes</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='secure'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>no</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </loader>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </os>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='maximumMigratable'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>on</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>off</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <vendor>AMD</vendor>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='succor'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <mode name='custom' supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ddpd-u'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sha512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm3'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sm4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Denverton-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amd-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='auto-ibrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='perfmon-v2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbpb'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='stibp-always-on'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='EPYC-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-128'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-256'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx10-512'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='prefetchiti'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Haswell-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512er'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512pf'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fma4'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tbm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xop'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='amx-tile'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-bf16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-fp16'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bitalg'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrc'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fzrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='la57'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='taa-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ifma'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cmpccxadd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fbsdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='fsrs'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ibrs-all'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='intel-psfd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='lam'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mcdt-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pbrsb-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='psdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='serialize'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vaes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='hle'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='rtm'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512bw'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512cd'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512dq'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512f'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='avx512vl'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='invpcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pcid'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='pku'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='mpx'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='core-capability'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='split-lock-detect'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='cldemote'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='erms'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='gfni'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdir64b'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='movdiri'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='xsaves'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='athlon-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='core2duo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='coreduo-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='n270-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='ss'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <blockers model='phenom-v1'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnow'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <feature name='3dnowext'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </blockers>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </mode>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <memoryBacking supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <enum name='sourceType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>anonymous</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <value>memfd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </memoryBacking>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <disk supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='diskDevice'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>disk</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cdrom</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>floppy</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>lun</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ide</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>fdc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>sata</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </disk>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <graphics supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vnc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egl-headless</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </graphics>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <video supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='modelType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vga</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>cirrus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>none</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>bochs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ramfb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </video>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hostdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='mode'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>subsystem</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='startupPolicy'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>mandatory</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>requisite</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>optional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='subsysType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pci</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>scsi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='capsType'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='pciBackend'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hostdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <rng supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtio-non-transitional</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>random</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>egd</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </rng>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <filesystem supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='driverType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>path</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>handle</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>virtiofs</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </filesystem>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tpm supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-tis</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tpm-crb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emulator</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>external</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendVersion'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>2.0</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </tpm>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <redirdev supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='bus'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>usb</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </redirdev>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <channel supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </channel>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <crypto supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendModel'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>builtin</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </crypto>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <interface supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='backendType'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>default</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>passt</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </interface>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <panic supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='model'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>isa</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>hyperv</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </panic>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <console supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='type'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>null</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vc</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pty</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dev</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>file</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>pipe</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stdio</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>udp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tcp</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>unix</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>qemu-vdagent</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>dbus</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </console>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </devices>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <features>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <gic supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <genid supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <backup supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <async-teardown supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <s390-pv supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <ps2 supported='yes'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <tdx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sev supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <sgx supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <hyperv supported='yes'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <enum name='features'>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>relaxed</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vapic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>spinlocks</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vpindex</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>runtime</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>synic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>stimer</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reset</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>vendor_id</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>frequencies</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>reenlightenment</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>tlbflush</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>ipi</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>avic</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>emsr_bitmap</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <value>xmm_input</value>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </enum>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       <defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:46 compute-1 nova_compute[184536]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:46 compute-1 nova_compute[184536]:       </defaults>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     </hyperv>
Feb 16 13:12:46 compute-1 nova_compute[184536]:     <launchSecurity supported='no'/>
Feb 16 13:12:46 compute-1 nova_compute[184536]:   </features>
Feb 16 13:12:46 compute-1 nova_compute[184536]: </domainCapabilities>
Feb 16 13:12:46 compute-1 nova_compute[184536]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.521 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.522 184540 INFO nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Secure Boot support detected
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.524 184540 INFO nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.524 184540 INFO nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.533 184540 DEBUG nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 13:12:46 compute-1 nova_compute[184536]:   <model>Nehalem</model>
Feb 16 13:12:46 compute-1 nova_compute[184536]: </cpu>
Feb 16 13:12:46 compute-1 nova_compute[184536]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.536 184540 DEBUG nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.576 184540 INFO nova.virt.node [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Determined node identity 63898862-3dd6-49b3-9545-63882243296a from /var/lib/nova/compute_id
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.603 184540 WARNING nova.compute.manager [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Compute nodes ['63898862-3dd6-49b3-9545-63882243296a'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.665 184540 INFO nova.compute.manager [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.703 184540 WARNING nova.compute.manager [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.703 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.703 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.704 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.704 184540 DEBUG nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:12:46 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 13:12:46 compute-1 systemd[1]: Started libvirt nodedev daemon.
Feb 16 13:12:46 compute-1 python3.9[185343]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.972 184540 WARNING nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.974 184540 DEBUG nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6152MB free_disk=73.43974304199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.974 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:46 compute-1 nova_compute[184536]: 2026-02-16 13:12:46.974 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:47 compute-1 nova_compute[184536]: 2026-02-16 13:12:47.013 184540 WARNING nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] No compute node record for compute-1.ctlplane.example.com:63898862-3dd6-49b3-9545-63882243296a: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 63898862-3dd6-49b3-9545-63882243296a could not be found.
Feb 16 13:12:47 compute-1 nova_compute[184536]: 2026-02-16 13:12:47.055 184540 INFO nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 63898862-3dd6-49b3-9545-63882243296a
Feb 16 13:12:47 compute-1 nova_compute[184536]: 2026-02-16 13:12:47.393 184540 DEBUG nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:12:47 compute-1 nova_compute[184536]: 2026-02-16 13:12:47.394 184540 DEBUG nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:12:47 compute-1 python3.9[185516]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:12:47 compute-1 nova_compute[184536]: 2026-02-16 13:12:47.978 184540 INFO nova.scheduler.client.report [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] [req-856e4ddd-1e5a-483c-a84d-70bffa91f262] Created resource provider record via placement API for resource provider with UUID 63898862-3dd6-49b3-9545-63882243296a and name compute-1.ctlplane.example.com.
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.022 184540 DEBUG nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 13:12:48 compute-1 nova_compute[184536]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.022 184540 INFO nova.virt.libvirt.host [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] kernel doesn't support AMD SEV
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.023 184540 DEBUG nova.compute.provider_tree [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.023 184540 DEBUG nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.025 184540 DEBUG nova.virt.libvirt.driver [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 13:12:48 compute-1 nova_compute[184536]:   <arch>x86_64</arch>
Feb 16 13:12:48 compute-1 nova_compute[184536]:   <model>Nehalem</model>
Feb 16 13:12:48 compute-1 nova_compute[184536]:   <vendor>AMD</vendor>
Feb 16 13:12:48 compute-1 nova_compute[184536]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 13:12:48 compute-1 nova_compute[184536]: </cpu>
Feb 16 13:12:48 compute-1 nova_compute[184536]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.076 184540 DEBUG nova.scheduler.client.report [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Updated inventory for provider 63898862-3dd6-49b3-9545-63882243296a with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.077 184540 DEBUG nova.compute.provider_tree [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.077 184540 DEBUG nova.compute.provider_tree [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.159 184540 DEBUG nova.compute.provider_tree [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.214 184540 DEBUG nova.compute.resource_tracker [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.214 184540 DEBUG oslo_concurrency.lockutils [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.214 184540 DEBUG nova.service [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.344 184540 DEBUG nova.service [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 13:12:48 compute-1 nova_compute[184536]: 2026-02-16 13:12:48.345 184540 DEBUG nova.servicegroup.drivers.db [None req-5807f575-b002-4995-b3ca-7207d2a86279 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 13:12:48 compute-1 sudo[185666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fezarlhdteulcnjxvjscjdsulwypkhxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247568.0000355-2642-58299638820134/AnsiballZ_podman_container.py'
Feb 16 13:12:48 compute-1 sudo[185666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:48 compute-1 python3.9[185668]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 13:12:48 compute-1 sudo[185666]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:48 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:12:48 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:12:49 compute-1 sudo[185843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htzfwsfqzhpyzlvgbpdqmboenuehvdrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247569.2056696-2658-1429857517744/AnsiballZ_systemd.py'
Feb 16 13:12:49 compute-1 sudo[185843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:49 compute-1 python3.9[185845]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 13:12:49 compute-1 systemd[1]: Stopping nova_compute container...
Feb 16 13:12:50 compute-1 nova_compute[184536]: 2026-02-16 13:12:50.413 184540 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 16 13:12:50 compute-1 nova_compute[184536]: 2026-02-16 13:12:50.416 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:50 compute-1 nova_compute[184536]: 2026-02-16 13:12:50.416 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:50 compute-1 nova_compute[184536]: 2026-02-16 13:12:50.416 184540 DEBUG oslo_concurrency.lockutils [None req-cca53244-cbef-48cc-9d5a-aee8b5fac058 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:51 compute-1 virtqemud[185025]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 16 13:12:51 compute-1 virtqemud[185025]: hostname: compute-1
Feb 16 13:12:51 compute-1 virtqemud[185025]: End of file while reading data: Input/output error
Feb 16 13:12:51 compute-1 systemd[1]: libpod-3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272.scope: Deactivated successfully.
Feb 16 13:12:51 compute-1 systemd[1]: libpod-3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272.scope: Consumed 3.411s CPU time.
Feb 16 13:12:51 compute-1 podman[185849]: 2026-02-16 13:12:51.03698833 +0000 UTC m=+1.187664903 container died 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:12:51 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272-userdata-shm.mount: Deactivated successfully.
Feb 16 13:12:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f-merged.mount: Deactivated successfully.
Feb 16 13:12:51 compute-1 podman[185849]: 2026-02-16 13:12:51.096306635 +0000 UTC m=+1.246983198 container cleanup 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:12:51 compute-1 podman[185849]: nova_compute
Feb 16 13:12:51 compute-1 podman[185880]: nova_compute
Feb 16 13:12:51 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 16 13:12:51 compute-1 systemd[1]: Stopped nova_compute container.
Feb 16 13:12:51 compute-1 systemd[1]: Starting nova_compute container...
Feb 16 13:12:51 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:12:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5d253ccd54da3ae3a92830dc5043152b14797cb1bec0dafc5d24941dce0950f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:51 compute-1 podman[185893]: 2026-02-16 13:12:51.283683944 +0000 UTC m=+0.094502024 container init 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute)
Feb 16 13:12:51 compute-1 podman[185893]: 2026-02-16 13:12:51.290483758 +0000 UTC m=+0.101301808 container start 3c0e6b6bc1f13a896926932566b49f3efd7db9bd7235d26d8d37c412cd686272 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:12:51 compute-1 podman[185893]: nova_compute
Feb 16 13:12:51 compute-1 systemd[1]: Started nova_compute container.
Feb 16 13:12:51 compute-1 nova_compute[185910]: + sudo -E kolla_set_configs
Feb 16 13:12:51 compute-1 sudo[185843]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Validating config file
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying service configuration files
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /etc/ceph
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Creating directory /etc/ceph
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Writing out command to execute
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:51 compute-1 nova_compute[185910]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 13:12:51 compute-1 nova_compute[185910]: ++ cat /run_command
Feb 16 13:12:51 compute-1 nova_compute[185910]: + CMD=nova-compute
Feb 16 13:12:51 compute-1 nova_compute[185910]: + ARGS=
Feb 16 13:12:51 compute-1 nova_compute[185910]: + sudo kolla_copy_cacerts
Feb 16 13:12:51 compute-1 nova_compute[185910]: + [[ ! -n '' ]]
Feb 16 13:12:51 compute-1 nova_compute[185910]: + . kolla_extend_start
Feb 16 13:12:51 compute-1 nova_compute[185910]: Running command: 'nova-compute'
Feb 16 13:12:51 compute-1 nova_compute[185910]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 13:12:51 compute-1 nova_compute[185910]: + umask 0022
Feb 16 13:12:51 compute-1 nova_compute[185910]: + exec nova-compute
Feb 16 13:12:51 compute-1 sudo[186071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbctiecjscgrpokyfyhsyudwqdzhlmwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247571.709559-2677-218220825910080/AnsiballZ_podman_container.py'
Feb 16 13:12:51 compute-1 sudo[186071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:12:52 compute-1 python3.9[186073]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 13:12:52 compute-1 systemd[1]: Started libpod-conmon-0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd.scope.
Feb 16 13:12:52 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:12:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eac917d68c39426130f6585c14fc9764439a03887776fd90bec4c657c6af6c52/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eac917d68c39426130f6585c14fc9764439a03887776fd90bec4c657c6af6c52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eac917d68c39426130f6585c14fc9764439a03887776fd90bec4c657c6af6c52/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 16 13:12:52 compute-1 podman[186100]: 2026-02-16 13:12:52.447008682 +0000 UTC m=+0.139775750 container init 0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 13:12:52 compute-1 podman[186100]: 2026-02-16 13:12:52.453390637 +0000 UTC m=+0.146157715 container start 0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute_init, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:12:52 compute-1 python3.9[186073]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Applying nova statedir ownership
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 16 13:12:52 compute-1 nova_compute_init[186121]: INFO:nova_statedir:Nova statedir ownership complete
Feb 16 13:12:52 compute-1 systemd[1]: libpod-0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd.scope: Deactivated successfully.
Feb 16 13:12:52 compute-1 podman[186135]: 2026-02-16 13:12:52.551599644 +0000 UTC m=+0.031876144 container died 0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:12:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd-userdata-shm.mount: Deactivated successfully.
Feb 16 13:12:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-eac917d68c39426130f6585c14fc9764439a03887776fd90bec4c657c6af6c52-merged.mount: Deactivated successfully.
Feb 16 13:12:52 compute-1 podman[186135]: 2026-02-16 13:12:52.591738794 +0000 UTC m=+0.072015284 container cleanup 0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'ca185f35183f0d3afb3434e4f9d860688e7415158fff7aa7252641524b4e66f1'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 13:12:52 compute-1 systemd[1]: libpod-conmon-0d2186e649f0a163be73b537090b7cf4c417eb94f2cb14bb5a0d4806d88b58bd.scope: Deactivated successfully.
Feb 16 13:12:52 compute-1 sudo[186071]: pam_unix(sudo:session): session closed for user root
Feb 16 13:12:53 compute-1 sshd-session[161038]: Connection closed by 192.168.122.30 port 36642
Feb 16 13:12:53 compute-1 sshd-session[161035]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:12:53 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Feb 16 13:12:53 compute-1 systemd[1]: session-25.scope: Consumed 1min 30.143s CPU time.
Feb 16 13:12:53 compute-1 systemd-logind[821]: Session 25 logged out. Waiting for processes to exit.
Feb 16 13:12:53 compute-1 systemd-logind[821]: Removed session 25.
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.385 185914 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.387 185914 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.388 185914 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.388 185914 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.605 185914 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.620 185914 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:12:53 compute-1 nova_compute[185910]: 2026-02-16 13:12:53.621 185914 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.106 185914 INFO nova.virt.driver [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.203 185914 INFO nova.compute.provider_config [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.215 185914 DEBUG oslo_concurrency.lockutils [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.215 185914 DEBUG oslo_concurrency.lockutils [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.216 185914 DEBUG oslo_concurrency.lockutils [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.216 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.216 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.216 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.217 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.218 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.219 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.220 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.220 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.220 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.220 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.220 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.221 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.222 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.223 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.224 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.225 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.226 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.227 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.228 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.229 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.230 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.231 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.232 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.233 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.234 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.235 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.236 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.237 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.238 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.239 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.240 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.241 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.242 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.243 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.244 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.245 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.246 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.247 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.248 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.249 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.250 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.251 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.252 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.253 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.254 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.255 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.256 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.257 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.258 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.259 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.260 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.261 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.262 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.263 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.264 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.265 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.265 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.265 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.265 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.265 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.266 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.267 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.268 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.269 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.270 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.271 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.272 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.273 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.274 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.275 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.276 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.277 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.278 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.279 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.280 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.281 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.282 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.283 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.284 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 WARNING oslo_config.cfg [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 13:12:54 compute-1 nova_compute[185910]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 13:12:54 compute-1 nova_compute[185910]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 13:12:54 compute-1 nova_compute[185910]: and ``live_migration_inbound_addr`` respectively.
Feb 16 13:12:54 compute-1 nova_compute[185910]: ).  Its value may be silently ignored in the future.
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.285 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.286 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.287 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.288 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.289 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.290 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.291 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.292 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.293 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.294 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.295 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.296 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.297 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.298 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.299 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.300 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.301 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.302 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.303 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.304 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.305 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.306 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.306 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.306 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.306 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.306 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.307 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.308 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.308 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.308 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.308 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.308 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.309 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.310 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.310 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.310 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.310 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.310 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.311 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.312 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.313 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.313 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.313 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.313 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.313 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.314 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.315 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.316 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.317 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.318 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.319 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.320 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.321 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.322 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.323 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.324 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.325 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.326 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.327 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.328 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.329 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.330 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.331 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.332 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.333 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.334 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.335 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.336 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.337 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.337 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.337 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.337 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.337 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.338 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.339 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.340 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.341 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.342 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.343 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.344 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.345 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.346 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.347 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.348 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.349 185914 DEBUG oslo_service.service [None req-41ab19da-3321-480d-b710-b3f5f81dd1b7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.351 185914 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.368 185914 INFO nova.virt.node [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Determined node identity 63898862-3dd6-49b3-9545-63882243296a from /var/lib/nova/compute_id
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.369 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.370 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.370 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.370 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.383 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f711d6bcc40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.386 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f711d6bcc40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.387 185914 INFO nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Connection event '1' reason 'None'
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.393 185914 INFO nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]: 
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <host>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <uuid>66cb33d6-ab36-4d79-b328-a95e41812fbd</uuid>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <arch>x86_64</arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <microcode version='16777317'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <signature family='23' model='49' stepping='0'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='x2apic'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='tsc-deadline'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='osxsave'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='hypervisor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='tsc_adjust'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='spec-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='stibp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='arch-capabilities'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='cmp_legacy'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='topoext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='virt-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='lbrv'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='tsc-scale'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='vmcb-clean'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='pause-filter'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='pfthreshold'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='svme-addr-chk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='rdctl-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='mds-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature name='pschange-mc-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <pages unit='KiB' size='4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <pages unit='KiB' size='2048'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <pages unit='KiB' size='1048576'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <power_management>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <suspend_mem/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <suspend_disk/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <suspend_hybrid/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </power_management>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <iommu support='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <migration_features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <live/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <uri_transports>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <uri_transport>tcp</uri_transport>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <uri_transport>rdma</uri_transport>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </uri_transports>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </migration_features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <topology>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <cells num='1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <cell id='0'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <memory unit='KiB'>7864284</memory>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <pages unit='KiB' size='4'>1966071</pages>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <distances>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <sibling id='0' value='10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           </distances>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           <cpus num='8'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:           </cpus>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         </cell>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </cells>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </topology>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <cache>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </cache>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <secmodel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model>selinux</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <doi>0</doi>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </secmodel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <secmodel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model>dac</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <doi>0</doi>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </secmodel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </host>
Feb 16 13:12:54 compute-1 nova_compute[185910]: 
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <guest>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <os_type>hvm</os_type>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <arch name='i686'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <wordsize>32</wordsize>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <domain type='qemu'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <domain type='kvm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <pae/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <nonpae/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <apic default='on' toggle='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <cpuselection/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <deviceboot/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <externalSnapshot/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </guest>
Feb 16 13:12:54 compute-1 nova_compute[185910]: 
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <guest>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <os_type>hvm</os_type>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <arch name='x86_64'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <wordsize>64</wordsize>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <domain type='qemu'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <domain type='kvm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <acpi default='on' toggle='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <apic default='on' toggle='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <cpuselection/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <deviceboot/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <disksnapshot default='on' toggle='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <externalSnapshot/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </guest>
Feb 16 13:12:54 compute-1 nova_compute[185910]: 
Feb 16 13:12:54 compute-1 nova_compute[185910]: </capabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]: 
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.401 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.406 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 13:12:54 compute-1 nova_compute[185910]: <domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <arch>i686</arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <vcpu max='4096'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <os supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <loader supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>rom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pflash</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='readonly'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>yes</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='secure'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </loader>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>anonymous</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>memfd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </memoryBacking>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <disk supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>disk</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cdrom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>floppy</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>lun</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>fdc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>sata</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vnc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </graphics>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <video supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='modelType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vga</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cirrus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>none</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>bochs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ramfb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='mode'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>subsystem</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>mandatory</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>requisite</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>optional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pci</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hostdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <rng supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>random</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='driverType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>path</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>handle</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </filesystem>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emulator</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>external</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>2.0</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </tpm>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </redirdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <channel supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </channel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </crypto>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <interface supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>passt</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <panic supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>isa</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>hyperv</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </panic>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <console supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>null</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dev</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pipe</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stdio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>udp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tcp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </console>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <gic supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sev supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='features'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>relaxed</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vapic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vpindex</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>runtime</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>synic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stimer</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reset</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>frequencies</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ipi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>avic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hyperv>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]: </domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.420 185914 DEBUG nova.virt.libvirt.volume.mount [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.430 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 13:12:54 compute-1 nova_compute[185910]: <domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <arch>i686</arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <vcpu max='240'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <os supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <loader supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>rom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pflash</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='readonly'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>yes</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='secure'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </loader>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>anonymous</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>memfd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </memoryBacking>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <disk supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>disk</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cdrom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>floppy</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>lun</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ide</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>fdc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>sata</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vnc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </graphics>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <video supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='modelType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vga</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cirrus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>none</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>bochs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ramfb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='mode'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>subsystem</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>mandatory</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>requisite</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>optional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pci</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hostdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <rng supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>random</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='driverType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>path</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>handle</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </filesystem>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emulator</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>external</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>2.0</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </tpm>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </redirdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <channel supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </channel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </crypto>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <interface supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>passt</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <panic supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>isa</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>hyperv</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </panic>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <console supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>null</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dev</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pipe</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stdio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>udp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tcp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </console>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <gic supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sev supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='features'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>relaxed</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vapic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vpindex</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>runtime</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>synic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stimer</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reset</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>frequencies</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ipi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>avic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hyperv>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]: </domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.472 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.477 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 13:12:54 compute-1 nova_compute[185910]: <domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <arch>x86_64</arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <vcpu max='4096'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <os supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='firmware'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>efi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <loader supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>rom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pflash</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='readonly'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>yes</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='secure'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>yes</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </loader>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>anonymous</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>memfd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </memoryBacking>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <disk supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>disk</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cdrom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>floppy</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>lun</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>fdc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>sata</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vnc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </graphics>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <video supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='modelType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vga</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cirrus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>none</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>bochs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ramfb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='mode'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>subsystem</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>mandatory</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>requisite</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>optional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pci</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hostdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <rng supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>random</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='driverType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>path</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>handle</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </filesystem>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emulator</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>external</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>2.0</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </tpm>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </redirdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <channel supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </channel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </crypto>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <interface supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>passt</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <panic supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>isa</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>hyperv</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </panic>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <console supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>null</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dev</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pipe</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stdio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>udp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tcp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </console>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <gic supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sev supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='features'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>relaxed</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vapic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vpindex</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>runtime</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>synic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stimer</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reset</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>frequencies</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ipi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>avic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hyperv>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]: </domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.554 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 13:12:54 compute-1 nova_compute[185910]: <domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <domain>kvm</domain>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <arch>x86_64</arch>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <vcpu max='240'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <iothreads supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <os supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='firmware'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <loader supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>rom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pflash</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='readonly'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>yes</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='secure'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>no</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </loader>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-passthrough' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='hostPassthroughMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='maximum' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='maximumMigratable'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>on</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>off</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='host-model' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <vendor>AMD</vendor>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='x2apic'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='hypervisor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='stibp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='overflow-recov'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='succor'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lbrv'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='tsc-scale'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='flushbyasid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pause-filter'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='pfthreshold'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <feature policy='disable' name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <mode name='custom' supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Broadwell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='ClearwaterForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ddpd-u'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sha512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm3'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sm4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Cooperlake-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Denverton-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Dhyana-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Milan-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Rome-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-Turin-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amd-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='auto-ibrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vp2intersect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fs-gs-base-ns'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibpb-brtype'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='no-nested-data-bp'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='null-sel-clr-base'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='perfmon-v2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbpb'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='srso-user-kernel-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='stibp-always-on'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='EPYC-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='GraniteRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-128'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-256'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx10-512'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='prefetchiti'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Haswell-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v6'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Icelake-Server-v7'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='IvyBridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='KnightsMill-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4fmaps'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-4vnniw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512er'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512pf'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G4-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Opteron_G5-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fma4'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tbm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xop'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SapphireRapids-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='amx-tile'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-bf16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-fp16'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512-vpopcntdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bitalg'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vbmi2'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrc'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fzrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='la57'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='taa-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='tsx-ldtrk'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='SierraForest-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ifma'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-ne-convert'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx-vnni-int8'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bhi-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='bus-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cmpccxadd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fbsdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='fsrs'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ibrs-all'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='intel-psfd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ipred-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='lam'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mcdt-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pbrsb-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='psdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rrsba-ctrl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='sbdr-ssdp-no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='serialize'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vaes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='vpclmulqdq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Client-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='hle'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='rtm'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Skylake-Server-v5'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512bw'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512cd'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512dq'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512f'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='avx512vl'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='invpcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pcid'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='pku'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='mpx'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v2'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v3'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='core-capability'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='split-lock-detect'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='Snowridge-v4'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='cldemote'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='erms'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='gfni'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdir64b'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='movdiri'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='xsaves'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='athlon-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='core2duo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='coreduo-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='n270-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='ss'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <blockers model='phenom-v1'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnow'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <feature name='3dnowext'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </blockers>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </mode>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <memoryBacking supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <enum name='sourceType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>anonymous</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <value>memfd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </memoryBacking>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <disk supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='diskDevice'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>disk</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cdrom</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>floppy</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>lun</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ide</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>fdc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>sata</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <graphics supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vnc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egl-headless</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </graphics>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <video supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='modelType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vga</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>cirrus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>none</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>bochs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ramfb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hostdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='mode'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>subsystem</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='startupPolicy'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>mandatory</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>requisite</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>optional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='subsysType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pci</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>scsi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='capsType'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='pciBackend'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hostdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <rng supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtio-non-transitional</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>random</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>egd</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <filesystem supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='driverType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>path</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>handle</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>virtiofs</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </filesystem>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tpm supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-tis</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tpm-crb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emulator</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>external</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendVersion'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>2.0</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </tpm>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <redirdev supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='bus'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>usb</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </redirdev>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <channel supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </channel>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <crypto supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendModel'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>builtin</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </crypto>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <interface supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='backendType'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>default</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>passt</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <panic supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='model'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>isa</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>hyperv</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </panic>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <console supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='type'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>null</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vc</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pty</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dev</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>file</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>pipe</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stdio</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>udp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tcp</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>unix</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>qemu-vdagent</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>dbus</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </console>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <gic supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <vmcoreinfo supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <genid supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backingStoreInput supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <backup supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <async-teardown supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <s390-pv supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <ps2 supported='yes'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <tdx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sev supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <sgx supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <hyperv supported='yes'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <enum name='features'>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>relaxed</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vapic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>spinlocks</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vpindex</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>runtime</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>synic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>stimer</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reset</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>vendor_id</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>frequencies</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>reenlightenment</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>tlbflush</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>ipi</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>avic</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>emsr_bitmap</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <value>xmm_input</value>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </enum>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       <defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <spinlocks>4095</spinlocks>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <stimer_direct>on</stimer_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 13:12:54 compute-1 nova_compute[185910]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 13:12:54 compute-1 nova_compute[185910]:       </defaults>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     </hyperv>
Feb 16 13:12:54 compute-1 nova_compute[185910]:     <launchSecurity supported='no'/>
Feb 16 13:12:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:12:54 compute-1 nova_compute[185910]: </domainCapabilities>
Feb 16 13:12:54 compute-1 nova_compute[185910]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.633 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.634 185914 INFO nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Secure Boot support detected
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.636 185914 INFO nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.636 185914 INFO nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.645 185914 DEBUG nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 13:12:54 compute-1 nova_compute[185910]:   <model>Nehalem</model>
Feb 16 13:12:54 compute-1 nova_compute[185910]: </cpu>
Feb 16 13:12:54 compute-1 nova_compute[185910]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.649 185914 DEBUG nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.755 185914 INFO nova.virt.node [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Determined node identity 63898862-3dd6-49b3-9545-63882243296a from /var/lib/nova/compute_id
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.805 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Verified node 63898862-3dd6-49b3-9545-63882243296a matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 16 13:12:54 compute-1 nova_compute[185910]: 2026-02-16 13:12:54.848 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 13:12:55 compute-1 rsyslogd[1016]: imjournal from <np0005620857:nova_compute>: begin to drop messages due to rate-limiting
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.057 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.058 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.058 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.058 185914 DEBUG nova.compute.resource_tracker [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.215 185914 WARNING nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.216 185914 DEBUG nova.compute.resource_tracker [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6184MB free_disk=73.43770599365234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.216 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.217 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.494 185914 DEBUG nova.compute.resource_tracker [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.494 185914 DEBUG nova.compute.resource_tracker [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.606 185914 DEBUG nova.scheduler.client.report [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.636 185914 DEBUG nova.scheduler.client.report [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.636 185914 DEBUG nova.compute.provider_tree [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.658 185914 DEBUG nova.scheduler.client.report [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.708 185914 DEBUG nova.scheduler.client.report [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.769 185914 DEBUG nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 13:12:55 compute-1 nova_compute[185910]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.770 185914 INFO nova.virt.libvirt.host [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] kernel doesn't support AMD SEV
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.770 185914 DEBUG nova.compute.provider_tree [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.771 185914 DEBUG nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.773 185914 DEBUG nova.virt.libvirt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 13:12:55 compute-1 nova_compute[185910]:   <arch>x86_64</arch>
Feb 16 13:12:55 compute-1 nova_compute[185910]:   <model>Nehalem</model>
Feb 16 13:12:55 compute-1 nova_compute[185910]:   <vendor>AMD</vendor>
Feb 16 13:12:55 compute-1 nova_compute[185910]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 13:12:55 compute-1 nova_compute[185910]: </cpu>
Feb 16 13:12:55 compute-1 nova_compute[185910]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.817 185914 DEBUG nova.scheduler.client.report [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.843 185914 DEBUG nova.compute.resource_tracker [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.843 185914 DEBUG oslo_concurrency.lockutils [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.843 185914 DEBUG nova.service [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.891 185914 DEBUG nova.service [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 13:12:55 compute-1 nova_compute[185910]: 2026-02-16 13:12:55.892 185914 DEBUG nova.servicegroup.drivers.db [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 13:12:59 compute-1 sshd-session[186210]: Accepted publickey for zuul from 192.168.122.30 port 54678 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:12:59 compute-1 systemd-logind[821]: New session 27 of user zuul.
Feb 16 13:13:00 compute-1 systemd[1]: Started Session 27 of User zuul.
Feb 16 13:13:00 compute-1 sshd-session[186210]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:13:00 compute-1 python3.9[186363]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 13:13:02 compute-1 sudo[186517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slmdptzmxsmsxlvnhuzvxtdzdrgbedsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247581.679576-53-263686963473821/AnsiballZ_systemd_service.py'
Feb 16 13:13:02 compute-1 sudo[186517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:02 compute-1 python3.9[186519]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:02 compute-1 systemd[1]: Reloading.
Feb 16 13:13:02 compute-1 systemd-sysv-generator[186551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:02 compute-1 systemd-rc-local-generator[186548]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:02 compute-1 sudo[186517]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:13:03.315 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:13:03.316 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:13:03.317 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:03 compute-1 python3.9[186712]: ansible-ansible.builtin.service_facts Invoked
Feb 16 13:13:03 compute-1 network[186729]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 13:13:03 compute-1 network[186730]: 'network-scripts' will be removed from distribution in near future.
Feb 16 13:13:03 compute-1 network[186731]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 13:13:06 compute-1 sudo[187002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eybqhesufdovukkihvlarkaxzduvautd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247586.34166-91-160389056394829/AnsiballZ_systemd_service.py'
Feb 16 13:13:06 compute-1 sudo[187002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:06 compute-1 python3.9[187004]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:06 compute-1 sudo[187002]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:07 compute-1 sudo[187155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktkztogtsubcazsclxekyivucbugbye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247587.3532777-111-270807108477215/AnsiballZ_file.py'
Feb 16 13:13:07 compute-1 sudo[187155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:07 compute-1 python3.9[187157]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:07 compute-1 sudo[187155]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:07 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:13:07 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:13:08 compute-1 sudo[187318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyyeveiqicagpazgulxegxjwsypgwqtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247588.165956-127-77098512681228/AnsiballZ_file.py'
Feb 16 13:13:08 compute-1 sudo[187318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:08 compute-1 podman[187282]: 2026-02-16 13:13:08.508317201 +0000 UTC m=+0.079042013 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 13:13:08 compute-1 python3.9[187325]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:08 compute-1 sudo[187318]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:09 compute-1 sudo[187480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqxgfsoutftoeldtudplgmdupictpqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247588.9773693-145-218804385800984/AnsiballZ_command.py'
Feb 16 13:13:09 compute-1 sudo[187480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:09 compute-1 python3.9[187482]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:13:09 compute-1 sudo[187480]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:10 compute-1 python3.9[187634]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:13:10 compute-1 sudo[187784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpuvztkvboiktawptcaanqordsplzosz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247590.5695107-181-277757812034510/AnsiballZ_systemd_service.py'
Feb 16 13:13:10 compute-1 sudo[187784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:11 compute-1 python3.9[187786]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:11 compute-1 systemd[1]: Reloading.
Feb 16 13:13:11 compute-1 systemd-sysv-generator[187816]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:11 compute-1 systemd-rc-local-generator[187813]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:11 compute-1 sudo[187784]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:11 compute-1 sudo[187989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvyuntednrkvfcextxmxhqrqlehxxzax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247591.6106174-197-217500764295742/AnsiballZ_command.py'
Feb 16 13:13:11 compute-1 sudo[187989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:11 compute-1 podman[187951]: 2026-02-16 13:13:11.910342694 +0000 UTC m=+0.086688977 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 16 13:13:12 compute-1 python3.9[187999]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:13:12 compute-1 sudo[187989]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:12 compute-1 sudo[188158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbityynecpmqnewfvmsqftavjwutzaki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247592.3056316-215-34087232287202/AnsiballZ_file.py'
Feb 16 13:13:12 compute-1 sudo[188158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:12 compute-1 python3.9[188160]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:12 compute-1 sudo[188158]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:13 compute-1 python3.9[188310]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:14 compute-1 sudo[188462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccelidkoiqxzgdpqwvexnlnvpbauaqxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247593.84152-247-52919786584916/AnsiballZ_group.py'
Feb 16 13:13:14 compute-1 sudo[188462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:14 compute-1 python3.9[188464]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 16 13:13:14 compute-1 sudo[188462]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:15 compute-1 sudo[188614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jssbhmvzcxlqqpvekvseffmlyqtkbzkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247595.0966485-269-72446823823224/AnsiballZ_getent.py'
Feb 16 13:13:15 compute-1 sudo[188614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:15 compute-1 python3.9[188616]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 16 13:13:15 compute-1 sudo[188614]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:16 compute-1 sudo[188767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwukibemljgdluyyygapjfvtbqkwadsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247595.897175-285-232906040902794/AnsiballZ_group.py'
Feb 16 13:13:16 compute-1 sudo[188767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:16 compute-1 python3.9[188769]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 13:13:16 compute-1 groupadd[188770]: group added to /etc/group: name=ceilometer, GID=42405
Feb 16 13:13:16 compute-1 groupadd[188770]: group added to /etc/gshadow: name=ceilometer
Feb 16 13:13:16 compute-1 groupadd[188770]: new group: name=ceilometer, GID=42405
Feb 16 13:13:16 compute-1 sudo[188767]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:17 compute-1 sudo[188925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkwybzvsgojiwgxadbkwwirdknvtjfgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247596.8536203-301-280690252964487/AnsiballZ_user.py'
Feb 16 13:13:17 compute-1 sudo[188925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:17 compute-1 python3.9[188927]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 13:13:17 compute-1 useradd[188929]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 13:13:17 compute-1 useradd[188929]: add 'ceilometer' to group 'libvirt'
Feb 16 13:13:17 compute-1 useradd[188929]: add 'ceilometer' to shadow group 'libvirt'
Feb 16 13:13:17 compute-1 sudo[188925]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:19 compute-1 python3.9[189085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:20 compute-1 python3.9[189206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247598.7988057-353-239101664988022/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:20 compute-1 python3.9[189356]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:21 compute-1 python3.9[189477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247600.3448117-353-65086369193788/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:21 compute-1 python3.9[189627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:22 compute-1 python3.9[189748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771247601.3450444-353-100502062482940/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:22 compute-1 sshd-session[189873]: Connection closed by 188.166.42.159 port 46916
Feb 16 13:13:23 compute-1 python3.9[189899]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:23 compute-1 python3.9[190051]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:24 compute-1 python3.9[190203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:25 compute-1 python3.9[190324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247604.1116652-472-76443139852129/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:25 compute-1 python3.9[190474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:26 compute-1 python3.9[190595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247605.2040136-472-202867958839055/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:26 compute-1 python3.9[190745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:27 compute-1 python3.9[190866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247606.4702196-529-70865946138421/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:28 compute-1 python3.9[191016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:28 compute-1 python3.9[191137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247607.9679215-561-242457993973789/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:29 compute-1 python3.9[191287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:30 compute-1 python3.9[191408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247609.1496117-591-90588428887884/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:30 compute-1 python3.9[191558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:31 compute-1 python3.9[191679]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247610.2441473-621-175299731337105/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:31 compute-1 sudo[191829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqjrmqpcloyscwuirkxjykdtnulbclaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247611.341913-651-169964926028314/AnsiballZ_file.py'
Feb 16 13:13:31 compute-1 sudo[191829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:31 compute-1 python3.9[191831]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:31 compute-1 sudo[191829]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:32 compute-1 sudo[191981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruylbadqgazcfzxlczbaphtrdqbeszcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247612.0104613-667-214945137252388/AnsiballZ_file.py'
Feb 16 13:13:32 compute-1 sudo[191981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:32 compute-1 python3.9[191983]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:32 compute-1 sudo[191981]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:33 compute-1 python3.9[192133]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:33 compute-1 python3.9[192285]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:34 compute-1 python3.9[192437]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:34 compute-1 sudo[192589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ralwowpckdppxbkaeynywlsbgiqwkgkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247614.4466345-731-82544277114524/AnsiballZ_file.py'
Feb 16 13:13:34 compute-1 sudo[192589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:34 compute-1 python3.9[192591]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:34 compute-1 sudo[192589]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:35 compute-1 sudo[192741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwxkxeobjqaqvzgweoxzlvlbaxmesxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247615.0764997-747-75709487062766/AnsiballZ_systemd_service.py'
Feb 16 13:13:35 compute-1 sudo[192741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:35 compute-1 python3.9[192743]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:35 compute-1 systemd[1]: Reloading.
Feb 16 13:13:35 compute-1 systemd-rc-local-generator[192770]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:35 compute-1 systemd-sysv-generator[192774]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:35 compute-1 systemd[1]: Listening on Podman API Socket.
Feb 16 13:13:35 compute-1 sudo[192741]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:36 compute-1 sudo[192940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqnzixjajquqbutzqnqkpkligaetkknf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247616.2075245-765-274407931580731/AnsiballZ_stat.py'
Feb 16 13:13:36 compute-1 sudo[192940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:36 compute-1 python3.9[192942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:36 compute-1 sudo[192940]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:36 compute-1 sudo[193063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-focfpbxrprefefzznjgqfkzlflhheugi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247616.2075245-765-274407931580731/AnsiballZ_copy.py'
Feb 16 13:13:36 compute-1 sudo[193063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:36 compute-1 nova_compute[185910]: 2026-02-16 13:13:36.893 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:36 compute-1 nova_compute[185910]: 2026-02-16 13:13:36.918 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:37 compute-1 python3.9[193065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247616.2075245-765-274407931580731/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:37 compute-1 sudo[193063]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:37 compute-1 sudo[193215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlgehlprepjxzprmsgjdwyqftujsnpbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247617.7386894-807-199774706757535/AnsiballZ_file.py'
Feb 16 13:13:37 compute-1 sudo[193215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:38 compute-1 python3.9[193217]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:38 compute-1 sudo[193215]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:38 compute-1 sudo[193377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcgpkbpjyztrplvwylbvsdptlllhonqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247618.3691084-823-74663639095925/AnsiballZ_file.py'
Feb 16 13:13:38 compute-1 sudo[193377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:38 compute-1 podman[193341]: 2026-02-16 13:13:38.643741071 +0000 UTC m=+0.046100089 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:13:38 compute-1 python3.9[193387]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:38 compute-1 sudo[193377]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:39 compute-1 python3.9[193537]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:41 compute-1 sudo[193958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnmiqipiyyphykqmhwbwghbgbeawejsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247621.0691175-891-280470800607568/AnsiballZ_container_config_data.py'
Feb 16 13:13:41 compute-1 sudo[193958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:41 compute-1 python3.9[193960]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 16 13:13:41 compute-1 sudo[193958]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:42 compute-1 sudo[194121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtjvphbjgaauenmqijxqlwfgfzleinmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247622.1629324-913-8611523687910/AnsiballZ_container_config_hash.py'
Feb 16 13:13:42 compute-1 sudo[194121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:42 compute-1 podman[194084]: 2026-02-16 13:13:42.606981013 +0000 UTC m=+0.099288628 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 16 13:13:42 compute-1 python3.9[194129]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:13:42 compute-1 sudo[194121]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:43 compute-1 sudo[194286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azqgazayjimkjkrcjpqlotbhhwsyphsb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247623.1386294-933-257003108738656/AnsiballZ_edpm_container_manage.py'
Feb 16 13:13:43 compute-1 sudo[194286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:43 compute-1 python3[194288]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:13:45 compute-1 podman[194301]: 2026-02-16 13:13:45.096749907 +0000 UTC m=+1.136933042 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 13:13:45 compute-1 podman[194399]: 2026-02-16 13:13:45.209009651 +0000 UTC m=+0.037296446 container create c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Feb 16 13:13:45 compute-1 podman[194399]: 2026-02-16 13:13:45.189703164 +0000 UTC m=+0.017989969 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 13:13:45 compute-1 python3[194288]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 16 13:13:45 compute-1 sudo[194286]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:46 compute-1 sudo[194584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niubjepsdflbsmdmurwmolejenwwmigb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247626.545985-949-80711415736890/AnsiballZ_stat.py'
Feb 16 13:13:46 compute-1 sudo[194584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:47 compute-1 python3.9[194586]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:47 compute-1 sudo[194584]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:47 compute-1 sudo[194738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvoqyiwrpkfvdfigpnhgadwtcaybqhnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247627.3264623-967-257621654734733/AnsiballZ_file.py'
Feb 16 13:13:47 compute-1 sudo[194738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:47 compute-1 python3.9[194740]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:47 compute-1 sudo[194738]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:47 compute-1 sudo[194814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dapbwudgfuzlqsvqktgiaspbtavtrxza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247627.3264623-967-257621654734733/AnsiballZ_stat.py'
Feb 16 13:13:47 compute-1 sudo[194814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:48 compute-1 python3.9[194816]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:13:48 compute-1 sudo[194814]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:48 compute-1 sudo[194965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovbhegdpdorbqjqytecgztdgjjvfzffr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.1916583-967-209150258353696/AnsiballZ_copy.py'
Feb 16 13:13:48 compute-1 sudo[194965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:48 compute-1 python3.9[194967]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247628.1916583-967-209150258353696/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:48 compute-1 sudo[194965]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:49 compute-1 sudo[195043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xofuzpeawozbcgvoplpmdzupwxifuuuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.1916583-967-209150258353696/AnsiballZ_systemd.py'
Feb 16 13:13:49 compute-1 sudo[195043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:49 compute-1 sshd-session[194968]: Connection closed by authenticating user root 146.190.226.24 port 39234 [preauth]
Feb 16 13:13:49 compute-1 python3.9[195045]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:13:49 compute-1 systemd[1]: Reloading.
Feb 16 13:13:49 compute-1 systemd-rc-local-generator[195075]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:49 compute-1 systemd-sysv-generator[195078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:50 compute-1 sudo[195043]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:50 compute-1 sudo[195161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhufepxmwfklukgjqvigbhdfuppjmuyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247628.1916583-967-209150258353696/AnsiballZ_systemd.py'
Feb 16 13:13:50 compute-1 sudo[195161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:50 compute-1 python3.9[195163]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:13:50 compute-1 systemd[1]: Reloading.
Feb 16 13:13:50 compute-1 systemd-rc-local-generator[195193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:13:50 compute-1 systemd-sysv-generator[195196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:13:50 compute-1 systemd[1]: Starting podman_exporter container...
Feb 16 13:13:50 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:13:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b478f861779fd27e1e915cff20dac2f0e34831384e6d1b78d553d9ca8849f/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 13:13:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a63b478f861779fd27e1e915cff20dac2f0e34831384e6d1b78d553d9ca8849f/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 13:13:51 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41.
Feb 16 13:13:51 compute-1 podman[195209]: 2026-02-16 13:13:51.007275689 +0000 UTC m=+0.094767784 container init c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.020Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.020Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.020Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.020Z caller=handler.go:105 level=info collector=container
Feb 16 13:13:51 compute-1 podman[195209]: 2026-02-16 13:13:51.037001607 +0000 UTC m=+0.124493692 container start c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:13:51 compute-1 podman[195209]: podman_exporter
Feb 16 13:13:51 compute-1 systemd[1]: Starting Podman API Service...
Feb 16 13:13:51 compute-1 systemd[1]: Started Podman API Service.
Feb 16 13:13:51 compute-1 systemd[1]: Started podman_exporter container.
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="Setting parallel job count to 25"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="Using sqlite as database backend"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 16 13:13:51 compute-1 podman[195236]: @ - - [16/Feb/2026:13:13:51 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 16 13:13:51 compute-1 podman[195236]: time="2026-02-16T13:13:51Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:13:51 compute-1 sudo[195161]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:51 compute-1 podman[195236]: @ - - [16/Feb/2026:13:13:51 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12585 "" "Go-http-client/1.1"
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.101Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.101Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 16 13:13:51 compute-1 podman_exporter[195225]: ts=2026-02-16T13:13:51.102Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 16 13:13:51 compute-1 podman[195234]: 2026-02-16 13:13:51.128478191 +0000 UTC m=+0.081222308 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:13:51 compute-1 systemd[1]: c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41-33b42f36d5543ed4.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 13:13:51 compute-1 systemd[1]: c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41-33b42f36d5543ed4.service: Failed with result 'exit-code'.
Feb 16 13:13:52 compute-1 python3.9[195423]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:13:52 compute-1 rsyslogd[1016]: imjournal: 1870 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 16 13:13:52 compute-1 sudo[195573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqvhjkwalwqvpkrashspdyvcwinbqpeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247632.7824342-1057-269971490318777/AnsiballZ_stat.py'
Feb 16 13:13:52 compute-1 sudo[195573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:53 compute-1 python3.9[195575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:53 compute-1 sudo[195573]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:53 compute-1 sudo[195698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzsazxiawhixjapkbdysrhmemrewhwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247632.7824342-1057-269971490318777/AnsiballZ_copy.py'
Feb 16 13:13:53 compute-1 sudo[195698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.634 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.634 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.634 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.655 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.656 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.656 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.656 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.658 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.658 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:13:53 compute-1 python3.9[195700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247632.7824342-1057-269971490318777/.source.yaml _original_basename=.kzjl244p follow=False checksum=280f1141251475cb4d34033a34a96ef19e265de1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:53 compute-1 sudo[195698]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.691 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.692 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.692 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.692 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.807 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.808 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6045MB free_disk=73.38666534423828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.808 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.808 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.898 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.898 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.923 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.946 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.947 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:13:53 compute-1 nova_compute[185910]: 2026-02-16 13:13:53.947 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:13:54 compute-1 sudo[195850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwdisiyoqfjbxtvlmjnjvmqkagweozsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247633.9459853-1087-7742070891286/AnsiballZ_stat.py'
Feb 16 13:13:54 compute-1 sudo[195850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:54 compute-1 python3.9[195852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:13:54 compute-1 sudo[195850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:54 compute-1 sudo[195973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmfxmkghcqzvlclbfxjckjgfhgvjppcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247633.9459853-1087-7742070891286/AnsiballZ_copy.py'
Feb 16 13:13:54 compute-1 sudo[195973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:54 compute-1 python3.9[195975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771247633.9459853-1087-7742070891286/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:54 compute-1 sudo[195973]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:55 compute-1 sudo[196125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybreugnedhoxgiugtrvzylzeedrtjclc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247635.523809-1129-110657320443538/AnsiballZ_file.py'
Feb 16 13:13:55 compute-1 sudo[196125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:55 compute-1 python3.9[196127]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:55 compute-1 sudo[196125]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:56 compute-1 sudo[196277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nublksyodibcjzgynotxqaszacosmews ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247636.1666508-1145-137166675823894/AnsiballZ_file.py'
Feb 16 13:13:56 compute-1 sudo[196277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:56 compute-1 python3.9[196279]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 13:13:56 compute-1 sudo[196277]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:57 compute-1 python3.9[196429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:13:58 compute-1 sudo[196850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmkxiceafmvtpaebjcjykpozdsqoirz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247638.7132483-1213-261208966915919/AnsiballZ_container_config_data.py'
Feb 16 13:13:58 compute-1 sudo[196850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:13:59 compute-1 python3.9[196852]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 16 13:13:59 compute-1 sudo[196850]: pam_unix(sudo:session): session closed for user root
Feb 16 13:13:59 compute-1 sudo[197002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvckahebazlhrbiairfayleydgjfivn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247639.6258767-1235-232924260853311/AnsiballZ_container_config_hash.py'
Feb 16 13:13:59 compute-1 sudo[197002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:00 compute-1 python3.9[197004]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 13:14:00 compute-1 sudo[197002]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:00 compute-1 sudo[197154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kirhfzwzbhzsuskkqksclnybofpsmhog ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247640.441539-1255-28323876032444/AnsiballZ_edpm_container_manage.py'
Feb 16 13:14:00 compute-1 sudo[197154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:00 compute-1 python3[197156]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 13:14:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:14:03.317 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:14:03.318 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:14:03.318 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:03 compute-1 podman[197170]: 2026-02-16 13:14:03.362827605 +0000 UTC m=+2.362821630 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-1 podman[197270]: 2026-02-16 13:14:03.503307314 +0000 UTC m=+0.057695556 container create 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, container_name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Feb 16 13:14:03 compute-1 podman[197270]: 2026-02-16 13:14:03.473526204 +0000 UTC m=+0.027914476 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-1 python3[197156]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 13:14:03 compute-1 sudo[197154]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:04 compute-1 sudo[197458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cillduayxjblmbwhtxqegvmuqnpaysyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247643.8211086-1271-116017432588392/AnsiballZ_stat.py'
Feb 16 13:14:04 compute-1 sudo[197458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:04 compute-1 python3.9[197460]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:04 compute-1 sudo[197458]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:04 compute-1 sudo[197612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfwnmjprtwzijwposgjjucfebhhgbdcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247644.5710406-1289-173025159744505/AnsiballZ_file.py'
Feb 16 13:14:04 compute-1 sudo[197612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:05 compute-1 python3.9[197614]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:05 compute-1 sudo[197612]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:05 compute-1 sudo[197688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uugsgnlrkewwsjyrpskdnngoxmfapuuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247644.5710406-1289-173025159744505/AnsiballZ_stat.py'
Feb 16 13:14:05 compute-1 sudo[197688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:05 compute-1 python3.9[197690]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:05 compute-1 sudo[197688]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:05 compute-1 sudo[197839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbpcboqrlzyoisurrpsncplgyxnphsvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.477726-1289-243893997207345/AnsiballZ_copy.py'
Feb 16 13:14:05 compute-1 sudo[197839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:06 compute-1 python3.9[197841]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771247645.477726-1289-243893997207345/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:06 compute-1 sudo[197839]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:06 compute-1 sudo[197915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjvwjnxqerjzlzaehapqpfjqoushbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.477726-1289-243893997207345/AnsiballZ_systemd.py'
Feb 16 13:14:06 compute-1 sudo[197915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:06 compute-1 python3.9[197917]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 13:14:06 compute-1 systemd[1]: Reloading.
Feb 16 13:14:06 compute-1 systemd-sysv-generator[197948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:14:06 compute-1 systemd-rc-local-generator[197942]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:14:06 compute-1 sudo[197915]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:07 compute-1 sudo[198032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvwvgrqtmjdwrvfqppkfoqlnfduzkbvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247645.477726-1289-243893997207345/AnsiballZ_systemd.py'
Feb 16 13:14:07 compute-1 sudo[198032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:07 compute-1 python3.9[198034]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 13:14:07 compute-1 systemd[1]: Reloading.
Feb 16 13:14:07 compute-1 systemd-sysv-generator[198068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 13:14:07 compute-1 systemd-rc-local-generator[198062]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 13:14:07 compute-1 systemd[1]: Starting openstack_network_exporter container...
Feb 16 13:14:07 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:14:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870395d6ab380064b86a295d820dd3ede6d02dfbb6109d26e2197018e2c69a38/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870395d6ab380064b86a295d820dd3ede6d02dfbb6109d26e2197018e2c69a38/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870395d6ab380064b86a295d820dd3ede6d02dfbb6109d26e2197018e2c69a38/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 13:14:07 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935.
Feb 16 13:14:07 compute-1 podman[198080]: 2026-02-16 13:14:07.988148525 +0000 UTC m=+0.136017486 container init 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *bridge.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *coverage.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *datapath.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *iface.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *memory.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *ovn.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *pmd_perf.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *pmd_rxq.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: INFO    13:14:08 main.go:48: registering *vswitch.Collector
Feb 16 13:14:08 compute-1 openstack_network_exporter[198096]: NOTICE  13:14:08 main.go:76: listening on https://:9105/metrics
Feb 16 13:14:08 compute-1 podman[198080]: 2026-02-16 13:14:08.009829533 +0000 UTC m=+0.157698504 container start 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Feb 16 13:14:08 compute-1 podman[198080]: openstack_network_exporter
Feb 16 13:14:08 compute-1 systemd[1]: Started openstack_network_exporter container.
Feb 16 13:14:08 compute-1 sudo[198032]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:08 compute-1 podman[198106]: 2026-02-16 13:14:08.085901637 +0000 UTC m=+0.066974563 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347)
Feb 16 13:14:08 compute-1 podman[198251]: 2026-02-16 13:14:08.811669748 +0000 UTC m=+0.042785500 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:14:08 compute-1 python3.9[198288]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 13:14:08 compute-1 auditd[720]: Audit daemon rotating log files
Feb 16 13:14:09 compute-1 sudo[198444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epcuangebnsstzvvoackzzvzsuehhxqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247649.6367037-1379-173003169315564/AnsiballZ_stat.py'
Feb 16 13:14:09 compute-1 sudo[198444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:10 compute-1 python3.9[198446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:10 compute-1 sudo[198444]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:10 compute-1 sudo[198569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlgdljymolhueeftdzauirtogmhcflkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247649.6367037-1379-173003169315564/AnsiballZ_copy.py'
Feb 16 13:14:10 compute-1 sudo[198569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:10 compute-1 python3.9[198571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247649.6367037-1379-173003169315564/.source.yaml _original_basename=.32pilwk_ follow=False checksum=6d20d35f7d87354f9b20be3862dc4377478a0a27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:10 compute-1 sudo[198569]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:11 compute-1 sudo[198721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkwhswhsarjckayxhmelpqwihrnfjlrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247650.7807322-1409-144446299673007/AnsiballZ_find.py'
Feb 16 13:14:11 compute-1 sudo[198721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:11 compute-1 python3.9[198723]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 13:14:11 compute-1 sudo[198721]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:12 compute-1 podman[198748]: 2026-02-16 13:14:12.994966368 +0000 UTC m=+0.134186582 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:14:21 compute-1 podman[198775]: 2026-02-16 13:14:21.91014507 +0000 UTC m=+0.053773648 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:14:23 compute-1 sudo[198924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygqiterhdfvbcxjxgfuswrvseqonrknb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247663.4352522-1559-137546374850615/AnsiballZ_podman_container_info.py'
Feb 16 13:14:23 compute-1 sudo[198924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:23 compute-1 python3.9[198926]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 16 13:14:23 compute-1 sudo[198924]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:24 compute-1 sudo[199089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aotakpzrebggqnxlptcfyqlghglxuizj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247664.1108992-1567-109139126387640/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:24 compute-1 sudo[199089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:24 compute-1 python3.9[199091]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:24 compute-1 systemd[1]: Started libpod-conmon-6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1.scope.
Feb 16 13:14:24 compute-1 podman[199092]: 2026-02-16 13:14:24.638766532 +0000 UTC m=+0.087253349 container exec 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:14:24 compute-1 podman[199092]: 2026-02-16 13:14:24.670168761 +0000 UTC m=+0.118655568 container exec_died 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:14:24 compute-1 systemd[1]: libpod-conmon-6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1.scope: Deactivated successfully.
Feb 16 13:14:24 compute-1 sudo[199089]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:25 compute-1 sudo[199272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auqqpejhzxtrrkrtgalqbebulrumsbtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247664.865465-1575-167451879616457/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:25 compute-1 sudo[199272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:25 compute-1 python3.9[199274]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:25 compute-1 systemd[1]: Started libpod-conmon-6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1.scope.
Feb 16 13:14:25 compute-1 podman[199275]: 2026-02-16 13:14:25.382987705 +0000 UTC m=+0.077889049 container exec 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 16 13:14:25 compute-1 podman[199294]: 2026-02-16 13:14:25.44536137 +0000 UTC m=+0.052142730 container exec_died 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:14:25 compute-1 podman[199275]: 2026-02-16 13:14:25.450954617 +0000 UTC m=+0.145855961 container exec_died 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:14:25 compute-1 systemd[1]: libpod-conmon-6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1.scope: Deactivated successfully.
Feb 16 13:14:25 compute-1 sudo[199272]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:25 compute-1 sudo[199456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyuonpibpyvbhiswxbfzpktpzdzoqjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247665.6366594-1583-133731344644560/AnsiballZ_file.py'
Feb 16 13:14:25 compute-1 sudo[199456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:26 compute-1 python3.9[199458]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:26 compute-1 sudo[199456]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:26 compute-1 sudo[199608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktsdedqtyqvatkdqfwduafdejviatabl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247666.286531-1592-281334283952712/AnsiballZ_podman_container_info.py'
Feb 16 13:14:26 compute-1 sudo[199608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:26 compute-1 python3.9[199610]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 16 13:14:26 compute-1 sudo[199608]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:27 compute-1 sudo[199773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omhsmemvaiboaoivpavosldvnwtsstec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247666.913977-1600-47965879756625/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:27 compute-1 sudo[199773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:27 compute-1 python3.9[199775]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:27 compute-1 systemd[1]: Started libpod-conmon-6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879.scope.
Feb 16 13:14:27 compute-1 podman[199776]: 2026-02-16 13:14:27.396823533 +0000 UTC m=+0.064043895 container exec 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:14:27 compute-1 podman[199776]: 2026-02-16 13:14:27.430586082 +0000 UTC m=+0.097806444 container exec_died 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:14:27 compute-1 systemd[1]: libpod-conmon-6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879.scope: Deactivated successfully.
Feb 16 13:14:27 compute-1 sudo[199773]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:27 compute-1 sudo[199957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyuzvaeapsjjvdejdnhruicoliqzcaru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247667.6026323-1608-218244214675525/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:27 compute-1 sudo[199957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:28 compute-1 python3.9[199959]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:28 compute-1 systemd[1]: Started libpod-conmon-6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879.scope.
Feb 16 13:14:28 compute-1 podman[199960]: 2026-02-16 13:14:28.097810963 +0000 UTC m=+0.077533367 container exec 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 13:14:28 compute-1 podman[199960]: 2026-02-16 13:14:28.127705326 +0000 UTC m=+0.107427740 container exec_died 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:14:28 compute-1 systemd[1]: libpod-conmon-6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879.scope: Deactivated successfully.
Feb 16 13:14:28 compute-1 sudo[199957]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:28 compute-1 sudo[200139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brqxbckfczdeyxwljayhdmvdyrxdnjti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247668.326953-1616-46167057669115/AnsiballZ_file.py'
Feb 16 13:14:28 compute-1 sudo[200139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:28 compute-1 python3.9[200141]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:28 compute-1 sudo[200139]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:29 compute-1 sudo[200291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arkmdfafngvvhqvavdpndvmevxaywwvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247668.9553616-1625-89229707577428/AnsiballZ_podman_container_info.py'
Feb 16 13:14:29 compute-1 sudo[200291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:29 compute-1 python3.9[200293]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 16 13:14:29 compute-1 sudo[200291]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:29 compute-1 sudo[200456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqoxaylyodsmdjmjjjvgkdulhmbgjfik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247669.5640566-1633-23767286262677/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:29 compute-1 sudo[200456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:29 compute-1 python3.9[200458]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:30 compute-1 systemd[1]: Started libpod-conmon-c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41.scope.
Feb 16 13:14:30 compute-1 podman[200459]: 2026-02-16 13:14:30.073658156 +0000 UTC m=+0.064388445 container exec c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:14:30 compute-1 podman[200459]: 2026-02-16 13:14:30.105078516 +0000 UTC m=+0.095808775 container exec_died c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:14:30 compute-1 systemd[1]: libpod-conmon-c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41.scope: Deactivated successfully.
Feb 16 13:14:30 compute-1 sudo[200456]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:30 compute-1 sudo[200640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchbihdjzraxcuraghycpusjdhsvgebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247670.3240523-1641-78704520969071/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:30 compute-1 sudo[200640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:30 compute-1 python3.9[200642]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:30 compute-1 systemd[1]: Started libpod-conmon-c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41.scope.
Feb 16 13:14:30 compute-1 podman[200643]: 2026-02-16 13:14:30.834769204 +0000 UTC m=+0.070105176 container exec c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:14:30 compute-1 podman[200662]: 2026-02-16 13:14:30.892237582 +0000 UTC m=+0.048775139 container exec_died c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:14:30 compute-1 podman[200643]: 2026-02-16 13:14:30.899091437 +0000 UTC m=+0.134427399 container exec_died c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:14:30 compute-1 systemd[1]: libpod-conmon-c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41.scope: Deactivated successfully.
Feb 16 13:14:30 compute-1 sudo[200640]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:31 compute-1 sudo[200824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatfxzlunnuwwamrepqhkwwzdriemtqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247671.09644-1649-156860960680658/AnsiballZ_file.py'
Feb 16 13:14:31 compute-1 sudo[200824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:31 compute-1 python3.9[200826]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:31 compute-1 sudo[200824]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:31 compute-1 sudo[200976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypynzfmplovnhgwhxzzpqxzhwoyimocc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247671.717217-1658-262016116011279/AnsiballZ_podman_container_info.py'
Feb 16 13:14:31 compute-1 sudo[200976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:32 compute-1 python3.9[200978]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 16 13:14:32 compute-1 sudo[200976]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:32 compute-1 sudo[201143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqecolcwgiehovhdmerhqvglwamtsfpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247672.3516662-1666-68043087275408/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:32 compute-1 sudo[201143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:32 compute-1 python3.9[201145]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:32 compute-1 systemd[1]: Started libpod-conmon-63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935.scope.
Feb 16 13:14:32 compute-1 podman[201146]: 2026-02-16 13:14:32.868280931 +0000 UTC m=+0.072011473 container exec 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:14:32 compute-1 podman[201165]: 2026-02-16 13:14:32.930230842 +0000 UTC m=+0.052406577 container exec_died 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-type=git, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1770267347, com.redhat.component=ubi9-minimal-container)
Feb 16 13:14:32 compute-1 podman[201146]: 2026-02-16 13:14:32.934875621 +0000 UTC m=+0.138606133 container exec_died 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:14:32 compute-1 systemd[1]: libpod-conmon-63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935.scope: Deactivated successfully.
Feb 16 13:14:32 compute-1 sudo[201143]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:33 compute-1 sudo[201327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuijhfuvhckskawvyxmfrrkppbmawau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247673.1129997-1674-5783621381030/AnsiballZ_podman_container_exec.py'
Feb 16 13:14:33 compute-1 sudo[201327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:33 compute-1 python3.9[201329]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 13:14:33 compute-1 systemd[1]: Started libpod-conmon-63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935.scope.
Feb 16 13:14:33 compute-1 podman[201330]: 2026-02-16 13:14:33.606120453 +0000 UTC m=+0.075432295 container exec 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:14:33 compute-1 podman[201330]: 2026-02-16 13:14:33.640435819 +0000 UTC m=+0.109747671 container exec_died 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 16 13:14:33 compute-1 systemd[1]: libpod-conmon-63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935.scope: Deactivated successfully.
Feb 16 13:14:33 compute-1 sudo[201327]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:34 compute-1 sudo[201512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjrdiufpcbzowevdidexeqyuktloiall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247673.8239458-1682-133021148373158/AnsiballZ_file.py'
Feb 16 13:14:34 compute-1 sudo[201512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:34 compute-1 python3.9[201514]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:34 compute-1 sudo[201512]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:34 compute-1 sudo[201664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdsvlxgbpykrhqhtmqcnbompocyjoyer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247674.4990807-1693-18417598355300/AnsiballZ_file.py'
Feb 16 13:14:34 compute-1 sudo[201664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:34 compute-1 python3.9[201666]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:34 compute-1 sudo[201664]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:35 compute-1 sudo[201816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rymyjenfszjbpxdlbeeermfxhedmebfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247675.1115143-1709-108563631675850/AnsiballZ_stat.py'
Feb 16 13:14:35 compute-1 sudo[201816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:35 compute-1 python3.9[201818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:35 compute-1 sudo[201816]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:35 compute-1 sudo[201939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzkgwgkcbeqmjzueckpvmefcmxkblgtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247675.1115143-1709-108563631675850/AnsiballZ_copy.py'
Feb 16 13:14:35 compute-1 sudo[201939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:35 compute-1 python3.9[201941]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771247675.1115143-1709-108563631675850/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:36 compute-1 sudo[201939]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:36 compute-1 sudo[202091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vswzdrqrtwkhgchbvdmljtfndjilqafc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.2900145-1741-249805456402874/AnsiballZ_file.py'
Feb 16 13:14:36 compute-1 sudo[202091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:36 compute-1 python3.9[202093]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:36 compute-1 sudo[202091]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:37 compute-1 sudo[202243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtxgyihnrhfrmnwwtvszpfmsfyhfdqcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.9187665-1757-38144010750291/AnsiballZ_stat.py'
Feb 16 13:14:37 compute-1 sudo[202243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:37 compute-1 python3.9[202245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:37 compute-1 sudo[202243]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:37 compute-1 sudo[202321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfqtfadvfpywvrrztsadacaitmcixlsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247676.9187665-1757-38144010750291/AnsiballZ_file.py'
Feb 16 13:14:37 compute-1 sudo[202321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:37 compute-1 python3.9[202323]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:37 compute-1 sudo[202321]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:38 compute-1 sudo[202485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blyrxyvlvojytimobeobfcqdlfzinqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247678.0549412-1781-50808516788532/AnsiballZ_stat.py'
Feb 16 13:14:38 compute-1 sudo[202485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:38 compute-1 podman[202447]: 2026-02-16 13:14:38.333141123 +0000 UTC m=+0.060688905 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, io.openshift.expose-services=)
Feb 16 13:14:38 compute-1 podman[202497]: 2026-02-16 13:14:38.906962322 +0000 UTC m=+0.048997425 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:14:38 compute-1 python3.9[202491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:38 compute-1 sudo[202485]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:39 compute-1 sudo[202593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hniegepnmscpcrohtfihsaldaaykbqkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247678.0549412-1781-50808516788532/AnsiballZ_file.py'
Feb 16 13:14:39 compute-1 sudo[202593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:39 compute-1 python3.9[202595]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.z2g7h4t9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:39 compute-1 sudo[202593]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:40 compute-1 sudo[202745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxwnoludejuuyclxkwnsfhjgdgpwjpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247679.8416474-1805-17475390660089/AnsiballZ_stat.py'
Feb 16 13:14:40 compute-1 sudo[202745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:40 compute-1 python3.9[202747]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:40 compute-1 sudo[202745]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:40 compute-1 sudo[202823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pryedvrubuzulogjallefmoaagtlekcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247679.8416474-1805-17475390660089/AnsiballZ_file.py'
Feb 16 13:14:40 compute-1 sudo[202823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:40 compute-1 python3.9[202825]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:40 compute-1 sudo[202823]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:41 compute-1 sudo[202975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jknzokxleasoqsmiwmhmjzhwiuqkhzaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247680.8796988-1831-100278007077122/AnsiballZ_command.py'
Feb 16 13:14:41 compute-1 sudo[202975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:41 compute-1 python3.9[202977]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:41 compute-1 sudo[202975]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:41 compute-1 sudo[203128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymtlgfcamrxvlxosrreaivhskybmaagi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771247681.5124002-1847-161166846311816/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 13:14:41 compute-1 sudo[203128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:42 compute-1 python3[203130]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 13:14:42 compute-1 sudo[203128]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:42 compute-1 sudo[203282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddzhdtcgvcaiybyisriqelqecwtnrqrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247682.2927856-1863-135246675233769/AnsiballZ_stat.py'
Feb 16 13:14:42 compute-1 sudo[203282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:42 compute-1 python3.9[203284]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:42 compute-1 sudo[203282]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:42 compute-1 sshd-session[203131]: Connection closed by authenticating user root 188.166.42.159 port 40228 [preauth]
Feb 16 13:14:42 compute-1 sudo[203360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrjoeyimvlfkusbwyyazkrchignfdqsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247682.2927856-1863-135246675233769/AnsiballZ_file.py'
Feb 16 13:14:42 compute-1 sudo[203360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:43 compute-1 python3.9[203362]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:43 compute-1 sudo[203360]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:43 compute-1 sudo[203523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlyqxlxayhbumjtudgtgmpldofvkisqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247683.3689818-1887-6837907879305/AnsiballZ_stat.py'
Feb 16 13:14:43 compute-1 sudo[203523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:43 compute-1 podman[203486]: 2026-02-16 13:14:43.67330273 +0000 UTC m=+0.067473695 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 16 13:14:43 compute-1 python3.9[203529]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:43 compute-1 sudo[203523]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:44 compute-1 sudo[203616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odngoqoxysalobomqfmvplwkxowpruow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247683.3689818-1887-6837907879305/AnsiballZ_file.py'
Feb 16 13:14:44 compute-1 sudo[203616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:44 compute-1 python3.9[203618]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:44 compute-1 sudo[203616]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:44 compute-1 sudo[203768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbeuvfptnacgipuiqsdqjgshidlscqvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247684.45627-1911-161525786030428/AnsiballZ_stat.py'
Feb 16 13:14:44 compute-1 sudo[203768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:44 compute-1 python3.9[203770]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:44 compute-1 sudo[203768]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:45 compute-1 sudo[203846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upqllipttezxzsjjsgwscrnpknwxnosw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247684.45627-1911-161525786030428/AnsiballZ_file.py'
Feb 16 13:14:45 compute-1 sudo[203846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:45 compute-1 python3.9[203848]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:45 compute-1 sudo[203846]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:45 compute-1 sudo[203998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxvamxblqpsbgaqvxrdrexygvkpachuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247685.4653454-1935-179284117248228/AnsiballZ_stat.py'
Feb 16 13:14:45 compute-1 sudo[203998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:45 compute-1 python3.9[204000]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:45 compute-1 sudo[203998]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:46 compute-1 sudo[204076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkuytaaawbwkmeencvckytycsmnyqdme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247685.4653454-1935-179284117248228/AnsiballZ_file.py'
Feb 16 13:14:46 compute-1 sudo[204076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:46 compute-1 python3.9[204078]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:46 compute-1 sudo[204076]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:46 compute-1 sudo[204228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrznkgtpfyfowloexrjgpegadynbmzrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247686.5419865-1959-140381412498848/AnsiballZ_stat.py'
Feb 16 13:14:46 compute-1 sudo[204228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:47 compute-1 python3.9[204230]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 13:14:47 compute-1 sudo[204228]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:47 compute-1 sudo[204353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgsjuuescvtdqstovhphoeuwifzmwqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247686.5419865-1959-140381412498848/AnsiballZ_copy.py'
Feb 16 13:14:47 compute-1 sudo[204353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:47 compute-1 python3.9[204355]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771247686.5419865-1959-140381412498848/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:47 compute-1 sudo[204353]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:47 compute-1 sudo[204505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miltqimjcyntmqfdntepqeubfauabezp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247687.697778-1989-182698197100606/AnsiballZ_file.py'
Feb 16 13:14:47 compute-1 sudo[204505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:48 compute-1 python3.9[204507]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:48 compute-1 sudo[204505]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:48 compute-1 sudo[204657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjhfieppnbmxtgzuljkivjpmbklceyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247688.3026466-2005-234519544794277/AnsiballZ_command.py'
Feb 16 13:14:48 compute-1 sudo[204657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:48 compute-1 python3.9[204659]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:48 compute-1 sudo[204657]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:50 compute-1 sudo[204812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqlkejsfcszhswxsltpphjkflfddvacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247688.9298928-2021-35520014722142/AnsiballZ_blockinfile.py'
Feb 16 13:14:50 compute-1 sudo[204812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:50 compute-1 python3.9[204814]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:50 compute-1 sudo[204812]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:50 compute-1 sudo[204964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbsbvgmeeunwrofyvdgjjbtamviatelk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247690.5691252-2039-56671048872629/AnsiballZ_command.py'
Feb 16 13:14:50 compute-1 sudo[204964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:50 compute-1 python3.9[204966]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:51 compute-1 sudo[204964]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:51 compute-1 sudo[205117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjsigeploxsaknmwhztdlzdkrfofcszu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247691.1716104-2055-144190072987938/AnsiballZ_stat.py'
Feb 16 13:14:51 compute-1 sudo[205117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:51 compute-1 python3.9[205119]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 13:14:51 compute-1 sudo[205117]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:52 compute-1 sudo[205284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfaambnmvmozvqogjkmkymfcfdsbcimf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247691.7687857-2071-145230839153971/AnsiballZ_command.py'
Feb 16 13:14:52 compute-1 sudo[205284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:52 compute-1 podman[205245]: 2026-02-16 13:14:52.005800828 +0000 UTC m=+0.048339963 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:14:52 compute-1 python3.9[205297]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 13:14:52 compute-1 sudo[205284]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:52 compute-1 openstack_network_exporter[198096]: ERROR   13:14:52 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:14:52 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:14:52 compute-1 openstack_network_exporter[198096]: ERROR   13:14:52 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:14:52 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:14:52 compute-1 sudo[205455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uztxfofqljlzofpipdletsdvfffdqnii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771247692.3772118-2087-80650683944714/AnsiballZ_file.py'
Feb 16 13:14:52 compute-1 sudo[205455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:14:52 compute-1 python3.9[205457]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 13:14:52 compute-1 sudo[205455]: pam_unix(sudo:session): session closed for user root
Feb 16 13:14:53 compute-1 sshd-session[186213]: Connection closed by 192.168.122.30 port 54678
Feb 16 13:14:53 compute-1 sshd-session[186210]: pam_unix(sshd:session): session closed for user zuul
Feb 16 13:14:53 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Feb 16 13:14:53 compute-1 systemd[1]: session-27.scope: Consumed 1min 6.824s CPU time.
Feb 16 13:14:53 compute-1 systemd-logind[821]: Session 27 logged out. Waiting for processes to exit.
Feb 16 13:14:53 compute-1 systemd-logind[821]: Removed session 27.
Feb 16 13:14:53 compute-1 sshd-session[205482]: Invalid user ubuntu from 2.57.122.210 port 33616
Feb 16 13:14:53 compute-1 sshd-session[205482]: Connection closed by invalid user ubuntu 2.57.122.210 port 33616 [preauth]
Feb 16 13:14:53 compute-1 nova_compute[185910]: 2026-02-16 13:14:53.940 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:53 compute-1 nova_compute[185910]: 2026-02-16 13:14:53.973 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:53 compute-1 nova_compute[185910]: 2026-02-16 13:14:53.999 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:53.999 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.000 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.000 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.137 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.138 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5961MB free_disk=73.26239013671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.138 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.138 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.203 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.203 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.229 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.245 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.247 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.247 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.906 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.906 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.906 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:54 compute-1 nova_compute[185910]: 2026-02-16 13:14:54.906 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.657 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.658 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:14:55 compute-1 nova_compute[185910]: 2026-02-16 13:14:55.658 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:14:59 compute-1 sshd-session[205484]: Connection closed by authenticating user root 146.190.226.24 port 52438 [preauth]
Feb 16 13:15:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:15:03.319 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:15:03.319 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:15:03.320 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:05 compute-1 podman[195236]: time="2026-02-16T13:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:15:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:15:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2139 "" "Go-http-client/1.1"
Feb 16 13:15:08 compute-1 podman[205489]: 2026-02-16 13:15:08.931489817 +0000 UTC m=+0.076819981 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Feb 16 13:15:09 compute-1 podman[205510]: 2026-02-16 13:15:09.01690009 +0000 UTC m=+0.046758438 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:15:13 compute-1 podman[205531]: 2026-02-16 13:15:13.921880008 +0000 UTC m=+0.068325450 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:15:19 compute-1 openstack_network_exporter[198096]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:15:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:15:19 compute-1 openstack_network_exporter[198096]: ERROR   13:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:15:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:15:22 compute-1 podman[205558]: 2026-02-16 13:15:22.907008547 +0000 UTC m=+0.047525619 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:15:35 compute-1 podman[195236]: time="2026-02-16T13:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:15:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:15:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2139 "" "Go-http-client/1.1"
Feb 16 13:15:39 compute-1 podman[205583]: 2026-02-16 13:15:39.902851246 +0000 UTC m=+0.044438842 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 13:15:39 compute-1 podman[205582]: 2026-02-16 13:15:39.902851256 +0000 UTC m=+0.046143230 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, distribution-scope=public)
Feb 16 13:15:43 compute-1 sshd-session[205619]: Connection closed by authenticating user root 188.166.42.159 port 44720 [preauth]
Feb 16 13:15:44 compute-1 podman[205621]: 2026-02-16 13:15:44.942948527 +0000 UTC m=+0.070914452 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.655 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.655 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.656 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.656 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.823 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.824 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6101MB free_disk=73.26244735717773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.824 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.824 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:15:53 compute-1 podman[205647]: 2026-02-16 13:15:53.903551289 +0000 UTC m=+0.047655339 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.922 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.923 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.946 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.960 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.962 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:15:53 compute-1 nova_compute[185910]: 2026-02-16 13:15:53.962 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:15:54 compute-1 nova_compute[185910]: 2026-02-16 13:15:54.962 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:54 compute-1 nova_compute[185910]: 2026-02-16 13:15:54.963 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:55 compute-1 nova_compute[185910]: 2026-02-16 13:15:55.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:55 compute-1 nova_compute[185910]: 2026-02-16 13:15:55.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:55 compute-1 nova_compute[185910]: 2026-02-16 13:15:55.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:56 compute-1 nova_compute[185910]: 2026-02-16 13:15:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:56 compute-1 nova_compute[185910]: 2026-02-16 13:15:56.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:15:56 compute-1 nova_compute[185910]: 2026-02-16 13:15:56.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:15:56 compute-1 nova_compute[185910]: 2026-02-16 13:15:56.659 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:15:57 compute-1 nova_compute[185910]: 2026-02-16 13:15:57.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:57 compute-1 nova_compute[185910]: 2026-02-16 13:15:57.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:15:57 compute-1 nova_compute[185910]: 2026-02-16 13:15:57.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:16:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:01.495 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:16:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:01.496 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:16:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:01.497 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:16:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:03.320 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:03.321 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:16:03.321 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:05 compute-1 sshd-session[205672]: Connection closed by authenticating user root 146.190.226.24 port 56262 [preauth]
Feb 16 13:16:10 compute-1 podman[205675]: 2026-02-16 13:16:10.908133774 +0000 UTC m=+0.046116017 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:16:10 compute-1 podman[205674]: 2026-02-16 13:16:10.909393088 +0000 UTC m=+0.050154306 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=)
Feb 16 13:16:15 compute-1 podman[205713]: 2026-02-16 13:16:15.94665495 +0000 UTC m=+0.093402116 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 16 13:16:19 compute-1 openstack_network_exporter[198096]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:16:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:16:19 compute-1 openstack_network_exporter[198096]: ERROR   13:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:16:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:16:24 compute-1 podman[205739]: 2026-02-16 13:16:24.904150126 +0000 UTC m=+0.049663763 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:16:35 compute-1 podman[195236]: time="2026-02-16T13:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:16:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:16:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Feb 16 13:16:38 compute-1 sshd-session[205764]: Connection closed by authenticating user root 188.166.42.159 port 58700 [preauth]
Feb 16 13:16:41 compute-1 podman[205766]: 2026-02-16 13:16:41.900008951 +0000 UTC m=+0.045763958 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 16 13:16:41 compute-1 podman[205767]: 2026-02-16 13:16:41.907951566 +0000 UTC m=+0.048445640 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:16:46 compute-1 podman[205807]: 2026-02-16 13:16:46.9452168 +0000 UTC m=+0.063061555 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:16:49 compute-1 openstack_network_exporter[198096]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:16:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:16:49 compute-1 openstack_network_exporter[198096]: ERROR   13:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:16:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.686 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.687 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.687 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.687 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.819 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.820 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6151MB free_disk=73.26248931884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.820 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.821 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.935 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.936 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:16:54 compute-1 nova_compute[185910]: 2026-02-16 13:16:54.985 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:16:55 compute-1 nova_compute[185910]: 2026-02-16 13:16:55.005 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:16:55 compute-1 nova_compute[185910]: 2026-02-16 13:16:55.006 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:16:55 compute-1 nova_compute[185910]: 2026-02-16 13:16:55.007 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:16:55 compute-1 podman[205834]: 2026-02-16 13:16:55.923948199 +0000 UTC m=+0.068962414 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:16:56 compute-1 nova_compute[185910]: 2026-02-16 13:16:55.999 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-1 nova_compute[185910]: 2026-02-16 13:16:56.000 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-1 nova_compute[185910]: 2026-02-16 13:16:56.000 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-1 nova_compute[185910]: 2026-02-16 13:16:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:56 compute-1 nova_compute[185910]: 2026-02-16 13:16:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.719 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.719 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.719 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.753 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:16:57 compute-1 nova_compute[185910]: 2026-02-16 13:16:57.754 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:59 compute-1 nova_compute[185910]: 2026-02-16 13:16:59.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:16:59 compute-1 nova_compute[185910]: 2026-02-16 13:16:59.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:16:59 compute-1 sshd-session[205859]: Invalid user solana from 2.57.122.210 port 36314
Feb 16 13:17:00 compute-1 sshd-session[205859]: Connection closed by invalid user solana 2.57.122.210 port 36314 [preauth]
Feb 16 13:17:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:17:03.322 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:17:03.323 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:17:03.323 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:05 compute-1 podman[195236]: time="2026-02-16T13:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:17:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:17:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2145 "" "Go-http-client/1.1"
Feb 16 13:17:11 compute-1 sshd-session[205861]: Connection closed by authenticating user root 146.190.226.24 port 48240 [preauth]
Feb 16 13:17:12 compute-1 podman[205864]: 2026-02-16 13:17:12.905669037 +0000 UTC m=+0.043309168 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:17:12 compute-1 podman[205863]: 2026-02-16 13:17:12.939103131 +0000 UTC m=+0.079611489 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, name=ubi9/ubi-minimal, release=1770267347, container_name=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:17:17 compute-1 podman[205903]: 2026-02-16 13:17:17.916339131 +0000 UTC m=+0.059329837 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:17:19 compute-1 openstack_network_exporter[198096]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:17:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:17:19 compute-1 openstack_network_exporter[198096]: ERROR   13:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:17:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:17:26 compute-1 podman[205930]: 2026-02-16 13:17:26.92499417 +0000 UTC m=+0.065148112 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:17:33 compute-1 sshd-session[205954]: Connection closed by authenticating user root 188.166.42.159 port 36834 [preauth]
Feb 16 13:17:35 compute-1 podman[195236]: time="2026-02-16T13:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:17:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:17:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2142 "" "Go-http-client/1.1"
Feb 16 13:17:43 compute-1 podman[205956]: 2026-02-16 13:17:43.913013915 +0000 UTC m=+0.052450312 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7)
Feb 16 13:17:43 compute-1 podman[205957]: 2026-02-16 13:17:43.913001525 +0000 UTC m=+0.048166848 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:17:48 compute-1 podman[205997]: 2026-02-16 13:17:48.943195019 +0000 UTC m=+0.089718649 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:17:49 compute-1 openstack_network_exporter[198096]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:17:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:17:49 compute-1 openstack_network_exporter[198096]: ERROR   13:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:17:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.653 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.654 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.655 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:17:53 compute-1 nova_compute[185910]: 2026-02-16 13:17:53.674 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:55 compute-1 nova_compute[185910]: 2026-02-16 13:17:55.690 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.728 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.729 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.729 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.729 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.851 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.853 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6187MB free_disk=73.26248931884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.853 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.853 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.988 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:17:56 compute-1 nova_compute[185910]: 2026-02-16 13:17:56.989 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.093 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.147 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.148 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.173 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.196 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.216 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.235 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.236 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:17:57 compute-1 nova_compute[185910]: 2026-02-16 13:17:57.237 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:17:57 compute-1 podman[206024]: 2026-02-16 13:17:57.901955826 +0000 UTC m=+0.042060247 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.231 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.232 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.232 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.652 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:17:58 compute-1 nova_compute[185910]: 2026-02-16 13:17:58.652 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:00 compute-1 nova_compute[185910]: 2026-02-16 13:18:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:00 compute-1 nova_compute[185910]: 2026-02-16 13:18:00.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:18:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:18:03.324 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:18:03.324 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:18:03.325 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:05 compute-1 podman[195236]: time="2026-02-16T13:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:18:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:18:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2145 "" "Go-http-client/1.1"
Feb 16 13:18:14 compute-1 podman[206049]: 2026-02-16 13:18:14.902721323 +0000 UTC m=+0.048601640 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1770267347, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:18:14 compute-1 podman[206050]: 2026-02-16 13:18:14.928867967 +0000 UTC m=+0.071357334 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 16 13:18:19 compute-1 openstack_network_exporter[198096]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:18:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:18:19 compute-1 openstack_network_exporter[198096]: ERROR   13:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:18:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:18:19 compute-1 podman[206091]: 2026-02-16 13:18:19.946992398 +0000 UTC m=+0.093677856 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
container_name=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:18:21 compute-1 sshd-session[206118]: Connection closed by authenticating user root 146.190.226.24 port 52712 [preauth]
Feb 16 13:18:28 compute-1 podman[206120]: 2026-02-16 13:18:28.927906072 +0000 UTC m=+0.069528795 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:18:30 compute-1 sshd-session[206144]: Connection closed by authenticating user root 188.166.42.159 port 50812 [preauth]
Feb 16 13:18:35 compute-1 podman[195236]: time="2026-02-16T13:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:18:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:18:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2147 "" "Go-http-client/1.1"
Feb 16 13:18:45 compute-1 podman[206147]: 2026-02-16 13:18:45.913841014 +0000 UTC m=+0.053215792 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 16 13:18:45 compute-1 podman[206146]: 2026-02-16 13:18:45.915244731 +0000 UTC m=+0.059716494 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9)
Feb 16 13:18:49 compute-1 openstack_network_exporter[198096]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:18:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:18:49 compute-1 openstack_network_exporter[198096]: ERROR   13:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:18:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:18:50 compute-1 podman[206183]: 2026-02-16 13:18:50.930877737 +0000 UTC m=+0.072118843 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:18:56 compute-1 nova_compute[185910]: 2026-02-16 13:18:56.637 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:57 compute-1 nova_compute[185910]: 2026-02-16 13:18:57.629 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.635 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.636 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.636 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.637 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.670 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.670 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.671 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.671 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.802 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.803 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6184MB free_disk=73.26245880126953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.803 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.803 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.936 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.937 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.966 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.991 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.993 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:18:58 compute-1 nova_compute[185910]: 2026-02-16 13:18:58.993 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:18:59 compute-1 podman[206209]: 2026-02-16 13:18:59.907276861 +0000 UTC m=+0.043762042 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:18:59 compute-1 nova_compute[185910]: 2026-02-16 13:18:59.990 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:18:59 compute-1 nova_compute[185910]: 2026-02-16 13:18:59.990 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:18:59 compute-1 nova_compute[185910]: 2026-02-16 13:18:59.990 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:19:00 compute-1 nova_compute[185910]: 2026-02-16 13:19:00.009 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:19:00 compute-1 nova_compute[185910]: 2026-02-16 13:19:00.009 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:00 compute-1 nova_compute[185910]: 2026-02-16 13:19:00.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:00 compute-1 nova_compute[185910]: 2026-02-16 13:19:00.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:19:02 compute-1 nova_compute[185910]: 2026-02-16 13:19:02.628 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:03.326 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:19:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:03.326 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:19:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:03.326 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:19:05 compute-1 podman[195236]: time="2026-02-16T13:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:19:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:19:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Feb 16 13:19:16 compute-1 podman[206233]: 2026-02-16 13:19:16.904357156 +0000 UTC m=+0.048853937 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:19:16 compute-1 podman[206234]: 2026-02-16 13:19:16.904274274 +0000 UTC m=+0.045267602 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_managed=true)
Feb 16 13:19:18 compute-1 sshd-session[206274]: Invalid user solana from 2.57.122.210 port 39066
Feb 16 13:19:18 compute-1 sshd-session[206274]: Connection closed by invalid user solana 2.57.122.210 port 39066 [preauth]
Feb 16 13:19:19 compute-1 openstack_network_exporter[198096]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:19:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:19:19 compute-1 openstack_network_exporter[198096]: ERROR   13:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:19:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:19:21 compute-1 podman[206276]: 2026-02-16 13:19:21.919831226 +0000 UTC m=+0.065471212 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 16 13:19:25 compute-1 sshd-session[206302]: Connection closed by authenticating user root 188.166.42.159 port 45544 [preauth]
Feb 16 13:19:30 compute-1 podman[206304]: 2026-02-16 13:19:30.926009992 +0000 UTC m=+0.068495273 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:19:32 compute-1 sshd-session[206329]: Connection closed by authenticating user root 146.190.226.24 port 49412 [preauth]
Feb 16 13:19:35 compute-1 podman[195236]: time="2026-02-16T13:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:19:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:19:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 16 13:19:47 compute-1 podman[206332]: 2026-02-16 13:19:47.899772646 +0000 UTC m=+0.041480961 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:19:47 compute-1 podman[206331]: 2026-02-16 13:19:47.900026532 +0000 UTC m=+0.046180826 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1770267347)
Feb 16 13:19:49 compute-1 openstack_network_exporter[198096]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:19:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:19:49 compute-1 openstack_network_exporter[198096]: ERROR   13:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:19:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:19:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:51.755 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:19:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:51.756 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:19:52 compute-1 podman[206371]: 2026-02-16 13:19:52.930900584 +0000 UTC m=+0.073636039 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:19:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:19:55.758 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:19:57 compute-1 nova_compute[185910]: 2026-02-16 13:19:57.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:57 compute-1 nova_compute[185910]: 2026-02-16 13:19:57.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:58 compute-1 nova_compute[185910]: 2026-02-16 13:19:58.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:58 compute-1 nova_compute[185910]: 2026-02-16 13:19:58.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:59 compute-1 nova_compute[185910]: 2026-02-16 13:19:59.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:19:59 compute-1 nova_compute[185910]: 2026-02-16 13:19:59.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:19:59 compute-1 nova_compute[185910]: 2026-02-16 13:19:59.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:19:59 compute-1 nova_compute[185910]: 2026-02-16 13:19:59.667 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:19:59 compute-1 nova_compute[185910]: 2026-02-16 13:19:59.668 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.675 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.677 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.819 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.820 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.2624397277832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.820 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.821 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.948 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:20:00 compute-1 nova_compute[185910]: 2026-02-16 13:20:00.949 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:20:01 compute-1 nova_compute[185910]: 2026-02-16 13:20:01.027 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:20:01 compute-1 nova_compute[185910]: 2026-02-16 13:20:01.272 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:20:01 compute-1 nova_compute[185910]: 2026-02-16 13:20:01.273 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:20:01 compute-1 nova_compute[185910]: 2026-02-16 13:20:01.274 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:01 compute-1 podman[206398]: 2026-02-16 13:20:01.9053386 +0000 UTC m=+0.051173999 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:20:02 compute-1 nova_compute[185910]: 2026-02-16 13:20:02.273 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:02 compute-1 nova_compute[185910]: 2026-02-16 13:20:02.273 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:20:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:20:03.327 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:20:03.327 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:20:03.327 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:05 compute-1 podman[195236]: time="2026-02-16T13:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:20:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:20:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2144 "" "Go-http-client/1.1"
Feb 16 13:20:18 compute-1 podman[206423]: 2026-02-16 13:20:18.914312524 +0000 UTC m=+0.054601040 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., 
vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:20:18 compute-1 podman[206424]: 2026-02-16 13:20:18.930769853 +0000 UTC m=+0.071953822 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:20:19 compute-1 openstack_network_exporter[198096]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:20:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:20:19 compute-1 openstack_network_exporter[198096]: ERROR   13:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:20:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:20:23 compute-1 sshd-session[206461]: Connection closed by authenticating user root 188.166.42.159 port 38346 [preauth]
Feb 16 13:20:23 compute-1 podman[206463]: 2026-02-16 13:20:23.966983682 +0000 UTC m=+0.102427706 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, container_name=ovn_controller)
Feb 16 13:20:32 compute-1 podman[206489]: 2026-02-16 13:20:32.906950444 +0000 UTC m=+0.051178258 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:20:35 compute-1 podman[195236]: time="2026-02-16T13:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:20:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:20:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Feb 16 13:20:42 compute-1 sshd-session[206513]: Connection closed by authenticating user root 146.190.226.24 port 52002 [preauth]
Feb 16 13:20:49 compute-1 openstack_network_exporter[198096]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:20:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:20:49 compute-1 openstack_network_exporter[198096]: ERROR   13:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:20:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:20:49 compute-1 podman[206516]: 2026-02-16 13:20:49.902058335 +0000 UTC m=+0.040810061 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:20:49 compute-1 podman[206515]: 2026-02-16 13:20:49.930944566 +0000 UTC m=+0.074338496 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, release=1770267347, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.186 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.186 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.203 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.296 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.296 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.305 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.305 185914 INFO nova.compute.claims [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.428 185914 DEBUG nova.compute.provider_tree [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.456 185914 DEBUG nova.scheduler.client.report [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.482 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.483 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.560 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.561 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.589 185914 INFO nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.613 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.741 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.743 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.744 185914 INFO nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Creating image(s)
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.745 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.745 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.746 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.747 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:51 compute-1 nova_compute[185910]: 2026-02-16 13:20:51.748 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:52 compute-1 nova_compute[185910]: 2026-02-16 13:20:52.823 185914 WARNING oslo_policy.policy [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 13:20:52 compute-1 nova_compute[185910]: 2026-02-16 13:20:52.824 185914 WARNING oslo_policy.policy [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 13:20:52 compute-1 nova_compute[185910]: 2026-02-16 13:20:52.826 185914 DEBUG nova.policy [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:20:53 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:20:53.820 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:20:53 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:20:53.821 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:20:53 compute-1 nova_compute[185910]: 2026-02-16 13:20:53.881 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:53 compute-1 nova_compute[185910]: 2026-02-16 13:20:53.931 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:53 compute-1 nova_compute[185910]: 2026-02-16 13:20:53.933 185914 DEBUG nova.virt.images [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] 6fb9af7f-2971-4890-a777-6e99e888717f was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 16 13:20:53 compute-1 nova_compute[185910]: 2026-02-16 13:20:53.934 185914 DEBUG nova.privsep.utils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 13:20:53 compute-1 nova_compute[185910]: 2026-02-16 13:20:53.934 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.058 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.part /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.061 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.100 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Successfully created port: ec35d953-ee21-47b6-bef7-1618058f79be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.106 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7.converted --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.107 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.119 185914 INFO oslo.privsep.daemon [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp5_mq2cln/privsep.sock']
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.886 185914 INFO oslo.privsep.daemon [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Spawned new privsep daemon via rootwrap
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.670 206572 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.674 206572 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.676 206572 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 13:20:54 compute-1 nova_compute[185910]: 2026-02-16 13:20:54.676 206572 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206572
Feb 16 13:20:54 compute-1 podman[206573]: 2026-02-16 13:20:54.970539755 +0000 UTC m=+0.114651874 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.020 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.067 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.068 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.069 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.079 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.124 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.126 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.152 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.153 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.154 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.201 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.202 185914 DEBUG nova.virt.disk.api [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.202 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.246 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.247 185914 DEBUG nova.virt.disk.api [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.248 185914 DEBUG nova.objects.instance [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid 5021a07d-59d2-49c7-b92f-0c25c5dc1222 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.266 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.266 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Ensure instance console log exists: /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.267 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.267 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:55 compute-1 nova_compute[185910]: 2026-02-16 13:20:55.267 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.229 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Successfully updated port: ec35d953-ee21-47b6-bef7-1618058f79be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.254 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.255 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.255 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.387 185914 DEBUG nova.compute.manager [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-changed-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.387 185914 DEBUG nova.compute.manager [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Refreshing instance network info cache due to event network-changed-ec35d953-ee21-47b6-bef7-1618058f79be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.387 185914 DEBUG oslo_concurrency.lockutils [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:20:56 compute-1 nova_compute[185910]: 2026-02-16 13:20:56.780 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.828 185914 DEBUG nova.network.neutron [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updating instance_info_cache with network_info: [{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.888 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.888 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Instance network_info: |[{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.889 185914 DEBUG oslo_concurrency.lockutils [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.889 185914 DEBUG nova.network.neutron [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Refreshing network info cache for port ec35d953-ee21-47b6-bef7-1618058f79be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.893 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Start _get_guest_xml network_info=[{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.901 185914 WARNING nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.908 185914 DEBUG nova.virt.libvirt.host [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.908 185914 DEBUG nova.virt.libvirt.host [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.913 185914 DEBUG nova.virt.libvirt.host [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.913 185914 DEBUG nova.virt.libvirt.host [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.915 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.915 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.916 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.916 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.916 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.917 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.917 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.917 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.917 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.918 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.918 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.918 185914 DEBUG nova.virt.hardware [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.922 185914 DEBUG nova.privsep.utils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.924 185914 DEBUG nova.virt.libvirt.vif [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-836105514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-836105514',id=2,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-90mzu2dc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-proje
ct-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:20:51Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=5021a07d-59d2-49c7-b92f-0c25c5dc1222,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.924 185914 DEBUG nova.network.os_vif_util [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.925 185914 DEBUG nova.network.os_vif_util [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.927 185914 DEBUG nova.objects.instance [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 5021a07d-59d2-49c7-b92f-0c25c5dc1222 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.945 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <uuid>5021a07d-59d2-49c7-b92f-0c25c5dc1222</uuid>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <name>instance-00000002</name>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-836105514</nova:name>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:20:57</nova:creationTime>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         <nova:port uuid="ec35d953-ee21-47b6-bef7-1618058f79be">
Feb 16 13:20:57 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <system>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="serial">5021a07d-59d2-49c7-b92f-0c25c5dc1222</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="uuid">5021a07d-59d2-49c7-b92f-0c25c5dc1222</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </system>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <os>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </os>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <features>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </features>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.config"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:be:da:c0"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <target dev="tapec35d953-ee"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/console.log" append="off"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <video>
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </video>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:20:57 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:20:57 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:20:57 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:20:57 compute-1 nova_compute[185910]: </domain>
Feb 16 13:20:57 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.946 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Preparing to wait for external event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.947 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.947 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.947 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.948 185914 DEBUG nova.virt.libvirt.vif [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-836105514',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-836105514',id=2,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-90mzu2dc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:20:51Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=5021a07d-59d2-49c7-b92f-0c25c5dc1222,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.948 185914 DEBUG nova.network.os_vif_util [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.949 185914 DEBUG nova.network.os_vif_util [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.950 185914 DEBUG os_vif [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.984 185914 DEBUG ovsdbapp.backend.ovs_idl [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.984 185914 DEBUG ovsdbapp.backend.ovs_idl [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.985 185914 DEBUG ovsdbapp.backend.ovs_idl [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.986 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.986 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.986 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.987 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.998 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.999 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:57 compute-1 nova_compute[185910]: 2026-02-16 13:20:57.999 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.000 185914 INFO oslo.privsep.daemon [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp_nagfqth/privsep.sock']
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.703 185914 INFO oslo.privsep.daemon [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Spawned new privsep daemon via rootwrap
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.540 206619 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.543 206619 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.545 206619 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 16 13:20:58 compute-1 nova_compute[185910]: 2026-02-16 13:20:58.546 206619 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206619
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.087 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.088 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec35d953-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.089 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapec35d953-ee, col_values=(('external_ids', {'iface-id': 'ec35d953-ee21-47b6-bef7-1618058f79be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:da:c0', 'vm-uuid': '5021a07d-59d2-49c7-b92f-0c25c5dc1222'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:20:59 compute-1 NetworkManager[56388]: <info>  [1771248059.0916] manager: (tapec35d953-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.093 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.096 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.097 185914 INFO os_vif [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee')
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.155 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.156 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.157 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:be:da:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.157 185914 INFO nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Using config drive
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.556 185914 DEBUG nova.network.neutron [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updated VIF entry in instance network info cache for port ec35d953-ee21-47b6-bef7-1618058f79be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.557 185914 DEBUG nova.network.neutron [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updating instance_info_cache with network_info: [{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.577 185914 DEBUG oslo_concurrency.lockutils [req-b5ddf945-0131-4eab-a563-d6a744e016a4 req-5dc6d917-6ab3-4ba1-9b98-6034afb38f45 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:20:59 compute-1 nova_compute[185910]: 2026-02-16 13:20:59.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.053 185914 INFO nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Creating config drive at /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.config
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.058 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjlkb51t4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.194 185914 DEBUG oslo_concurrency.processutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjlkb51t4" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:00 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 16 13:21:00 compute-1 kernel: tapec35d953-ee: entered promiscuous mode
Feb 16 13:21:00 compute-1 NetworkManager[56388]: <info>  [1771248060.2832] manager: (tapec35d953-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 16 13:21:00 compute-1 ovn_controller[96285]: 2026-02-16T13:21:00Z|00027|binding|INFO|Claiming lport ec35d953-ee21-47b6-bef7-1618058f79be for this chassis.
Feb 16 13:21:00 compute-1 ovn_controller[96285]: 2026-02-16T13:21:00Z|00028|binding|INFO|ec35d953-ee21-47b6-bef7-1618058f79be: Claiming fa:16:3e:be:da:c0 10.100.0.10
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.284 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:00 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.304 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:da:c0 10.100.0.10'], port_security=['fa:16:3e:be:da:c0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5021a07d-59d2-49c7-b92f-0c25c5dc1222', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=ec35d953-ee21-47b6-bef7-1618058f79be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:21:00 compute-1 systemd-udevd[206644]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:21:00 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.307 105573 INFO neutron.agent.ovn.metadata.agent [-] Port ec35d953-ee21-47b6-bef7-1618058f79be in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:21:00 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.309 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:00 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.311 105573 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpv9cnr_jr/privsep.sock']
Feb 16 13:21:00 compute-1 NetworkManager[56388]: <info>  [1771248060.3167] device (tapec35d953-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:21:00 compute-1 NetworkManager[56388]: <info>  [1771248060.3184] device (tapec35d953-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.330 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:00 compute-1 ovn_controller[96285]: 2026-02-16T13:21:00Z|00029|binding|INFO|Setting lport ec35d953-ee21-47b6-bef7-1618058f79be ovn-installed in OVS
Feb 16 13:21:00 compute-1 ovn_controller[96285]: 2026-02-16T13:21:00Z|00030|binding|INFO|Setting lport ec35d953-ee21-47b6-bef7-1618058f79be up in Southbound
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.334 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:00 compute-1 systemd-machined[155419]: New machine qemu-1-instance-00000002.
Feb 16 13:21:00 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:00 compute-1 nova_compute[185910]: 2026-02-16 13:21:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.015 185914 DEBUG nova.compute.manager [req-e3294aba-dbe7-450a-b6d5-d54aee58a490 req-021f5a0a-eac1-4ee1-a1a7-e4c5653ec5c4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.015 185914 DEBUG oslo_concurrency.lockutils [req-e3294aba-dbe7-450a-b6d5-d54aee58a490 req-021f5a0a-eac1-4ee1-a1a7-e4c5653ec5c4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.015 185914 DEBUG oslo_concurrency.lockutils [req-e3294aba-dbe7-450a-b6d5-d54aee58a490 req-021f5a0a-eac1-4ee1-a1a7-e4c5653ec5c4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.016 185914 DEBUG oslo_concurrency.lockutils [req-e3294aba-dbe7-450a-b6d5-d54aee58a490 req-021f5a0a-eac1-4ee1-a1a7-e4c5653ec5c4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.016 185914 DEBUG nova.compute.manager [req-e3294aba-dbe7-450a-b6d5-d54aee58a490 req-021f5a0a-eac1-4ee1-a1a7-e4c5653ec5c4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Processing event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.026 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.027 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248061.0259573, 5021a07d-59d2-49c7-b92f-0c25c5dc1222 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.027 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] VM Started (Lifecycle Event)
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.041 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.048 185914 INFO nova.virt.libvirt.driver [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Instance spawned successfully.
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.050 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.063 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.066 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.076 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.076 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.077 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.077 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.078 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.078 185914 DEBUG nova.virt.libvirt.driver [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.086 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.087 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248061.026722, 5021a07d-59d2-49c7-b92f-0c25c5dc1222 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.087 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] VM Paused (Lifecycle Event)
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.142 105573 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.144 105573 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv9cnr_jr/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.973 206668 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.979 206668 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.981 206668 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:00.981 206668 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206668
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.147 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3f0374-5cc3-4e4c-81fb-8afabc0d5799]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.149 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.153 185914 INFO nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Took 9.41 seconds to spawn the instance on the hypervisor.
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.154 185914 DEBUG nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.156 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248061.0309172, 5021a07d-59d2-49c7-b92f-0c25c5dc1222 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.156 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] VM Resumed (Lifecycle Event)
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.185 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.189 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.213 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.227 185914 INFO nova.compute.manager [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Took 9.97 seconds to build instance.
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.245 185914 DEBUG oslo_concurrency.lockutils [None req-f1dda29b-a5d7-46e9-97d0-1c3a9b69e8c9 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.685 206668 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.685 206668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.685 206668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.816 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.817 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.817 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:21:01 compute-1 nova_compute[185910]: 2026-02-16 13:21:01.817 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5021a07d-59d2-49c7-b92f-0c25c5dc1222 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:01 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:01.823 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.429 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[958cd18e-1871-46f8-ad6e-e71bdf628586]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.430 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6199784-11 in ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.432 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6199784-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.432 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[95d7d522-3b05-47d0-9a24-35600051b4f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.435 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6e40e553-5521-41cf-8f20-c7350d51aae6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.557 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7fd4aa-702b-4d57-bded-8e476ce068fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:02 compute-1 nova_compute[185910]: 2026-02-16 13:21:02.573 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.628 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9a8bde-b641-45bc-b5d7-fd1022a4b54c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:02.630 105573 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpuh9x8p6k/privsep.sock']
Feb 16 13:21:02 compute-1 nova_compute[185910]: 2026-02-16 13:21:02.969 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updating instance_info_cache with network_info: [{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:02 compute-1 nova_compute[185910]: 2026-02-16 13:21:02.993 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:02 compute-1 nova_compute[185910]: 2026-02-16 13:21:02.994 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:21:02 compute-1 nova_compute[185910]: 2026-02-16 13:21:02.994 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.060 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.060 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.061 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.061 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.116 185914 DEBUG nova.compute.manager [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.116 185914 DEBUG oslo_concurrency.lockutils [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.116 185914 DEBUG oslo_concurrency.lockutils [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.117 185914 DEBUG oslo_concurrency.lockutils [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.117 185914 DEBUG nova.compute.manager [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] No waiting events found dispatching network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.117 185914 WARNING nova.compute.manager [req-4ed17f63-fd28-4cdf-8468-c95fc7fa5114 req-c0fe3d4a-b4ae-4353-8a83-e992051210f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received unexpected event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be for instance with vm_state active and task_state None.
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.233 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:03 compute-1 podman[206684]: 2026-02-16 13:21:03.296280071 +0000 UTC m=+0.072506087 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.310 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.312 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.327 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.329 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.329 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.361 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.796 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.798 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5840MB free_disk=73.22752380371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.798 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.799 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.942 105573 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.943 105573 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpuh9x8p6k/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.409 206713 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.584 206713 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.653 206713 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.653 206713 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206713
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.946 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5021a07d-59d2-49c7-b92f-0c25c5dc1222 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:21:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:03.946 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5651db-e0bf-486c-9734-5eb4a5e4d08a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.947 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.947 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:21:03 compute-1 nova_compute[185910]: 2026-02-16 13:21:03.996 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.050 185914 ERROR nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [req-7c07ed6e-7702-43aa-ae76-e898f802d269] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 63898862-3dd6-49b3-9545-63882243296a.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-7c07ed6e-7702-43aa-ae76-e898f802d269"}]}
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.073 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.091 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.101 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.101 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.121 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.149 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.189 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.242 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updated inventory for provider 63898862-3dd6-49b3-9545-63882243296a with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.242 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.243 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.303 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.304 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:04.500 206713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:04.501 206713 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:04.501 206713 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.942 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.964 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:04 compute-1 nova_compute[185910]: 2026-02-16 13:21:04.965 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.508 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c93b7a-b511-4fbc-aaea-a52fb58cb282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 NetworkManager[56388]: <info>  [1771248065.5519] manager: (tapa6199784-10): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.550 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd3612b-0ab9-410f-9e72-238957136a57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 systemd-udevd[206725]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.583 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[2e663910-5c2b-456d-99f4-0f24103018cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.587 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[95759002-c2b1-4458-90f6-6ebeaee0b5db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 NetworkManager[56388]: <info>  [1771248065.6069] device (tapa6199784-10): carrier: link connected
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.612 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0382a0-238e-4c82-a228-0a0284cb0389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.626 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e5dd64a0-0b80-4551-a023-3c5a94dde244]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 21372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206743, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.636 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee9d65e-311b-44fc-aa57-7a56801cde72]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:b943'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415002, 'tstamp': 415002}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206744, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 podman[195236]: time="2026-02-16T13:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:21:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:21:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.651 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[88511430-b707-45ea-ba78-b90ea02cb7a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 21372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206745, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.675 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3a224b41-ee8b-4504-88c1-a4c49cb34f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.713 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[947211d4-37e6-4665-8fef-46615f18fddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.715 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.715 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.716 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:05 compute-1 nova_compute[185910]: 2026-02-16 13:21:05.717 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:05 compute-1 NetworkManager[56388]: <info>  [1771248065.7187] manager: (tapa6199784-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 16 13:21:05 compute-1 kernel: tapa6199784-10: entered promiscuous mode
Feb 16 13:21:05 compute-1 nova_compute[185910]: 2026-02-16 13:21:05.719 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.723 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:05 compute-1 nova_compute[185910]: 2026-02-16 13:21:05.724 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:05 compute-1 ovn_controller[96285]: 2026-02-16T13:21:05Z|00031|binding|INFO|Releasing lport 3b5a298b-9fc2-4705-8faa-2b8cfb88937b from this chassis (sb_readonly=0)
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.727 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:21:05 compute-1 nova_compute[185910]: 2026-02-16 13:21:05.728 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.729 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ddd73461-ae5a-4953-859c-a045e976126e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.730 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/a6199784-1742-41a7-9152-bb54abb7bef1.pid.haproxy
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:21:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:05.732 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'env', 'PROCESS_TAG=haproxy-a6199784-1742-41a7-9152-bb54abb7bef1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6199784-1742-41a7-9152-bb54abb7bef1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:21:06 compute-1 podman[206778]: 2026-02-16 13:21:06.286486254 +0000 UTC m=+0.021991508 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:21:06 compute-1 podman[206778]: 2026-02-16 13:21:06.388728245 +0000 UTC m=+0.124233729 container create 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:21:06 compute-1 systemd[1]: Started libpod-conmon-9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b.scope.
Feb 16 13:21:06 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:21:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d7042eadfbaad7b78f21ea23482de4d076e2b67c8702c3efa23c00b613fb06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:21:06 compute-1 podman[206778]: 2026-02-16 13:21:06.49560852 +0000 UTC m=+0.231113804 container init 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:21:06 compute-1 podman[206778]: 2026-02-16 13:21:06.504808376 +0000 UTC m=+0.240313620 container start 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:21:06 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [NOTICE]   (206797) : New worker (206799) forked
Feb 16 13:21:06 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [NOTICE]   (206797) : Loading success.
Feb 16 13:21:07 compute-1 nova_compute[185910]: 2026-02-16 13:21:07.574 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:09 compute-1 nova_compute[185910]: 2026-02-16 13:21:09.094 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.157 185914 DEBUG nova.compute.manager [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.266 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.267 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.486 185914 DEBUG nova.objects.instance [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'pci_requests' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.504 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.505 185914 INFO nova.compute.claims [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.506 185914 DEBUG nova.objects.instance [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'resources' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.518 185914 DEBUG nova.objects.instance [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'pci_devices' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.562 185914 INFO nova.compute.resource_tracker [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating resource usage from migration 8cb2fbbd-d3e9-4aa3-a7d8-6931faa17b05
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.563 185914 DEBUG nova.compute.resource_tracker [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Starting to track incoming migration 8cb2fbbd-d3e9-4aa3-a7d8-6931faa17b05 with flavor 24d59dff-ce99-4fd9-bd4e-4d0b2cf1ef53 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.576 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.643 185914 DEBUG nova.compute.provider_tree [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.661 185914 DEBUG nova.scheduler.client.report [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.699 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.699 185914 INFO nova.compute.manager [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Migrating
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.700 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.700 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.707 185914 INFO nova.compute.rpcapi [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 16 13:21:12 compute-1 nova_compute[185910]: 2026-02-16 13:21:12.708 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:14 compute-1 nova_compute[185910]: 2026-02-16 13:21:14.097 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:14 compute-1 sshd-session[206831]: Accepted publickey for nova from 192.168.122.100 port 51254 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:21:14 compute-1 systemd-logind[821]: New session 28 of user nova.
Feb 16 13:21:14 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:21:14 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:21:14 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:21:14 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:21:14 compute-1 systemd[206835]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:14 compute-1 systemd[206835]: Queued start job for default target Main User Target.
Feb 16 13:21:14 compute-1 systemd[206835]: Created slice User Application Slice.
Feb 16 13:21:14 compute-1 systemd[206835]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:21:14 compute-1 systemd[206835]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:21:14 compute-1 systemd[206835]: Reached target Paths.
Feb 16 13:21:14 compute-1 systemd[206835]: Reached target Timers.
Feb 16 13:21:14 compute-1 systemd[206835]: Starting D-Bus User Message Bus Socket...
Feb 16 13:21:15 compute-1 systemd[206835]: Starting Create User's Volatile Files and Directories...
Feb 16 13:21:15 compute-1 systemd[206835]: Finished Create User's Volatile Files and Directories.
Feb 16 13:21:15 compute-1 systemd[206835]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:21:15 compute-1 systemd[206835]: Reached target Sockets.
Feb 16 13:21:15 compute-1 systemd[206835]: Reached target Basic System.
Feb 16 13:21:15 compute-1 systemd[206835]: Reached target Main User Target.
Feb 16 13:21:15 compute-1 systemd[206835]: Startup finished in 221ms.
Feb 16 13:21:15 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:21:15 compute-1 systemd[1]: Started Session 28 of User nova.
Feb 16 13:21:15 compute-1 sshd-session[206831]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:15 compute-1 sshd-session[206850]: Received disconnect from 192.168.122.100 port 51254:11: disconnected by user
Feb 16 13:21:15 compute-1 sshd-session[206850]: Disconnected from user nova 192.168.122.100 port 51254
Feb 16 13:21:15 compute-1 sshd-session[206831]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:21:15 compute-1 systemd-logind[821]: Session 28 logged out. Waiting for processes to exit.
Feb 16 13:21:15 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Feb 16 13:21:15 compute-1 systemd-logind[821]: Removed session 28.
Feb 16 13:21:15 compute-1 sshd-session[206852]: Accepted publickey for nova from 192.168.122.100 port 51258 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:21:15 compute-1 systemd-logind[821]: New session 30 of user nova.
Feb 16 13:21:15 compute-1 systemd[1]: Started Session 30 of User nova.
Feb 16 13:21:15 compute-1 sshd-session[206852]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:15 compute-1 sshd-session[206855]: Received disconnect from 192.168.122.100 port 51258:11: disconnected by user
Feb 16 13:21:15 compute-1 sshd-session[206855]: Disconnected from user nova 192.168.122.100 port 51258
Feb 16 13:21:15 compute-1 sshd-session[206852]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:21:15 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Feb 16 13:21:15 compute-1 systemd-logind[821]: Session 30 logged out. Waiting for processes to exit.
Feb 16 13:21:15 compute-1 systemd-logind[821]: Removed session 30.
Feb 16 13:21:15 compute-1 ovn_controller[96285]: 2026-02-16T13:21:15Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:da:c0 10.100.0.10
Feb 16 13:21:15 compute-1 ovn_controller[96285]: 2026-02-16T13:21:15Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:da:c0 10.100.0.10
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.615 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.905 185914 DEBUG nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.906 185914 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.907 185914 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.907 185914 DEBUG oslo_concurrency.lockutils [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.907 185914 DEBUG nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:17 compute-1 nova_compute[185910]: 2026-02-16 13:21:17.907 185914 WARNING nova.compute.manager [req-0cc544bd-12e7-4651-9796-70df5b44fe8f req-f300d51b-8c0b-419a-9fc8-1f92732bacd0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_migrating.
Feb 16 13:21:18 compute-1 sshd-session[206858]: Accepted publickey for nova from 192.168.122.100 port 51264 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:21:18 compute-1 systemd-logind[821]: New session 31 of user nova.
Feb 16 13:21:18 compute-1 systemd[1]: Started Session 31 of User nova.
Feb 16 13:21:18 compute-1 sshd-session[206858]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:18 compute-1 sshd-session[206861]: Received disconnect from 192.168.122.100 port 51264:11: disconnected by user
Feb 16 13:21:18 compute-1 sshd-session[206861]: Disconnected from user nova 192.168.122.100 port 51264
Feb 16 13:21:18 compute-1 sshd-session[206858]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:21:18 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Feb 16 13:21:18 compute-1 systemd-logind[821]: Session 31 logged out. Waiting for processes to exit.
Feb 16 13:21:18 compute-1 systemd-logind[821]: Removed session 31.
Feb 16 13:21:19 compute-1 sshd-session[206863]: Accepted publickey for nova from 192.168.122.100 port 51268 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:21:19 compute-1 systemd-logind[821]: New session 32 of user nova.
Feb 16 13:21:19 compute-1 systemd[1]: Started Session 32 of User nova.
Feb 16 13:21:19 compute-1 sshd-session[206863]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:19 compute-1 nova_compute[185910]: 2026-02-16 13:21:19.100 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:19 compute-1 sshd-session[206866]: Received disconnect from 192.168.122.100 port 51268:11: disconnected by user
Feb 16 13:21:19 compute-1 sshd-session[206866]: Disconnected from user nova 192.168.122.100 port 51268
Feb 16 13:21:19 compute-1 sshd-session[206863]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:21:19 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Feb 16 13:21:19 compute-1 systemd-logind[821]: Session 32 logged out. Waiting for processes to exit.
Feb 16 13:21:19 compute-1 systemd-logind[821]: Removed session 32.
Feb 16 13:21:19 compute-1 sshd-session[206868]: Accepted publickey for nova from 192.168.122.100 port 51280 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:21:19 compute-1 systemd-logind[821]: New session 33 of user nova.
Feb 16 13:21:19 compute-1 systemd[1]: Started Session 33 of User nova.
Feb 16 13:21:19 compute-1 sshd-session[206868]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:21:19 compute-1 sshd-session[206871]: Received disconnect from 192.168.122.100 port 51280:11: disconnected by user
Feb 16 13:21:19 compute-1 sshd-session[206871]: Disconnected from user nova 192.168.122.100 port 51280
Feb 16 13:21:19 compute-1 sshd-session[206868]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:21:19 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Feb 16 13:21:19 compute-1 systemd-logind[821]: Session 33 logged out. Waiting for processes to exit.
Feb 16 13:21:19 compute-1 systemd-logind[821]: Removed session 33.
Feb 16 13:21:19 compute-1 openstack_network_exporter[198096]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:21:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:21:19 compute-1 openstack_network_exporter[198096]: ERROR   13:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:21:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.037 185914 DEBUG nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.038 185914 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.038 185914 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.039 185914 DEBUG oslo_concurrency.lockutils [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.039 185914 DEBUG nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.039 185914 WARNING nova.compute.manager [req-5be759d1-2a6d-4735-b9e9-a00b8edde096 req-b75d7c02-feab-4f24-9047-1d64e91e106d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_migrated.
Feb 16 13:21:20 compute-1 nova_compute[185910]: 2026-02-16 13:21:20.823 185914 INFO nova.network.neutron [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating port b0642d70-aac9-4a19-b18b-6f6a914d307a with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}
Feb 16 13:21:21 compute-1 podman[206875]: 2026-02-16 13:21:21.077241052 +0000 UTC m=+0.206434836 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z)
Feb 16 13:21:21 compute-1 podman[206876]: 2026-02-16 13:21:21.123017304 +0000 UTC m=+0.249111796 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 13:21:21 compute-1 sshd-session[206873]: Connection closed by authenticating user root 188.166.42.159 port 41326 [preauth]
Feb 16 13:21:21 compute-1 nova_compute[185910]: 2026-02-16 13:21:21.819 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:21 compute-1 nova_compute[185910]: 2026-02-16 13:21:21.820 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:21 compute-1 nova_compute[185910]: 2026-02-16 13:21:21.820 185914 DEBUG nova.network.neutron [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:21:22 compute-1 nova_compute[185910]: 2026-02-16 13:21:22.152 185914 DEBUG nova.compute.manager [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:22 compute-1 nova_compute[185910]: 2026-02-16 13:21:22.153 185914 DEBUG nova.compute.manager [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing instance network info cache due to event network-changed-b0642d70-aac9-4a19-b18b-6f6a914d307a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:21:22 compute-1 nova_compute[185910]: 2026-02-16 13:21:22.153 185914 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:21:22 compute-1 nova_compute[185910]: 2026-02-16 13:21:22.618 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.470 185914 DEBUG nova.network.neutron [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.493 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.498 185914 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.498 185914 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Refreshing network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.599 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.602 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.602 185914 INFO nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Creating image(s)
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.603 185914 DEBUG nova.objects.instance [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.627 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.674 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.675 185914 DEBUG nova.virt.disk.api [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.675 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.738 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.739 185914 DEBUG nova.virt.disk.api [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.793 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.794 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Ensure instance console log exists: /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.794 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.795 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.795 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.798 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start _get_guest_xml network_info=[{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.803 185914 WARNING nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.810 185914 DEBUG nova.virt.libvirt.host [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.811 185914 DEBUG nova.virt.libvirt.host [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.819 185914 DEBUG nova.virt.libvirt.host [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.820 185914 DEBUG nova.virt.libvirt.host [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.822 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.822 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:17:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='24d59dff-ce99-4fd9-bd4e-4d0b2cf1ef53',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.823 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.823 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.824 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.824 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.824 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.824 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.825 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.825 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.825 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.825 185914 DEBUG nova.virt.hardware [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.826 185914 DEBUG nova.objects.instance [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.848 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.896 185914 DEBUG oslo_concurrency.processutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.897 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.897 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.898 185914 DEBUG oslo_concurrency.lockutils [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.899 185914 DEBUG nova.virt.libvirt.vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:20:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:21:19Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.899 185914 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.900 185914 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.903 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <uuid>934dfad2-33a3-44dd-82c8-0b913e89cb8e</uuid>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <name>instance-00000001</name>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <memory>196608</memory>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-727824786</nova:name>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:21:23</nova:creationTime>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:flavor name="m1.micro">
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:memory>192</nova:memory>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         <nova:port uuid="b0642d70-aac9-4a19-b18b-6f6a914d307a">
Feb 16 13:21:23 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <system>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="serial">934dfad2-33a3-44dd-82c8-0b913e89cb8e</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="uuid">934dfad2-33a3-44dd-82c8-0b913e89cb8e</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </system>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <os>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </os>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <features>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </features>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk.config"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:b1:7c:d9"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <target dev="tapb0642d70-aa"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/console.log" append="off"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <video>
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </video>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:21:23 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:21:23 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:21:23 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:21:23 compute-1 nova_compute[185910]: </domain>
Feb 16 13:21:23 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.904 185914 DEBUG nova.virt.libvirt.vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:20:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:21:19Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.904 185914 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "vif_mac": "fa:16:3e:b1:7c:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.904 185914 DEBUG nova.network.os_vif_util [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.905 185914 DEBUG os_vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.905 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.906 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.906 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.909 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.910 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0642d70-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.910 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb0642d70-aa, col_values=(('external_ids', {'iface-id': 'b0642d70-aac9-4a19-b18b-6f6a914d307a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:7c:d9', 'vm-uuid': '934dfad2-33a3-44dd-82c8-0b913e89cb8e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.912 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:23 compute-1 NetworkManager[56388]: <info>  [1771248083.9133] manager: (tapb0642d70-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.916 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.918 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:23 compute-1 nova_compute[185910]: 2026-02-16 13:21:23.920 185914 INFO os_vif [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa')
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.227 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.228 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.228 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No VIF found with MAC fa:16:3e:b1:7c:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.229 185914 INFO nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Using config drive
Feb 16 13:21:24 compute-1 kernel: tapb0642d70-aa: entered promiscuous mode
Feb 16 13:21:24 compute-1 NetworkManager[56388]: <info>  [1771248084.2748] manager: (tapb0642d70-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Feb 16 13:21:24 compute-1 ovn_controller[96285]: 2026-02-16T13:21:24Z|00032|binding|INFO|Claiming lport b0642d70-aac9-4a19-b18b-6f6a914d307a for this chassis.
Feb 16 13:21:24 compute-1 ovn_controller[96285]: 2026-02-16T13:21:24Z|00033|binding|INFO|b0642d70-aac9-4a19-b18b-6f6a914d307a: Claiming fa:16:3e:b1:7c:d9 10.100.0.6
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.292 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:24 compute-1 ovn_controller[96285]: 2026-02-16T13:21:24Z|00034|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a ovn-installed in OVS
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.299 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:24 compute-1 systemd-udevd[206937]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:21:24 compute-1 ovn_controller[96285]: 2026-02-16T13:21:24Z|00035|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a up in Southbound
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.313 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:7c:d9 10.100.0.6'], port_security=['fa:16:3e:b1:7c:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '934dfad2-33a3-44dd-82c8-0b913e89cb8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '6', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b0642d70-aac9-4a19-b18b-6f6a914d307a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.315 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b0642d70-aac9-4a19-b18b-6f6a914d307a in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:21:24 compute-1 systemd-machined[155419]: New machine qemu-2-instance-00000001.
Feb 16 13:21:24 compute-1 NetworkManager[56388]: <info>  [1771248084.3178] device (tapb0642d70-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.317 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:21:24 compute-1 NetworkManager[56388]: <info>  [1771248084.3184] device (tapb0642d70-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:21:24 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.332 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9a05f9-72cc-47d3-9624-e7626a826a8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.363 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[91e06d9f-2455-4dcd-9818-050e7132007b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.368 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[67e6c2a6-f01c-4308-95f3-63983a811288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.394 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[570ee00f-c5b5-40cc-b291-05eea056172a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.410 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a039f60e-a439-4879-ab59-1a2e6786bcdc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 21372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206954, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.424 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ef767533-f3dd-4a13-9f35-9da56df3b677]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206955, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206955, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.425 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.427 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.430 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.430 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.431 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:21:24 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:21:24.431 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.616 185914 DEBUG nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.618 185914 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.618 185914 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.618 185914 DEBUG oslo_concurrency.lockutils [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.618 185914 DEBUG nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:24 compute-1 nova_compute[185910]: 2026-02-16 13:21:24.618 185914 WARNING nova.compute.manager [req-093cdede-9796-4280-a983-39904a79ac64 req-f214749d-a9ba-451e-b779-8c4fd22d9759 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state active and task_state resize_finish.
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.020 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248085.0194736, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.021 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Resumed (Lifecycle Event)
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.024 185914 DEBUG nova.compute.manager [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.030 185914 INFO nova.virt.libvirt.driver [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance running successfully.
Feb 16 13:21:25 compute-1 virtqemud[185025]: argument unsupported: QEMU guest agent is not configured
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.033 185914 DEBUG nova.virt.libvirt.guest [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.034 185914 DEBUG nova.virt.libvirt.driver [None req-70f5ec0a-04de-4d96-8596-c95bfa428503 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.045 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.051 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.106 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] During sync_power_state the instance has a pending task (resize_finish). Skip.
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.107 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248085.019598, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.107 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Started (Lifecycle Event)
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.133 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.137 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.530 185914 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated VIF entry in instance network info cache for port b0642d70-aac9-4a19-b18b-6f6a914d307a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.530 185914 DEBUG nova.network.neutron [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:21:25 compute-1 nova_compute[185910]: 2026-02-16 13:21:25.551 185914 DEBUG oslo_concurrency.lockutils [req-58a8cfd8-48a5-42e2-8e4a-bdf02da22039 req-f71dad49-aa73-47a2-8969-ed90f537276b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:21:26 compute-1 podman[206964]: 2026-02-16 13:21:26.184965937 +0000 UTC m=+0.224603467 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.731 185914 DEBUG nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.732 185914 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.732 185914 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.732 185914 DEBUG oslo_concurrency.lockutils [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.732 185914 DEBUG nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:21:26 compute-1 nova_compute[185910]: 2026-02-16 13:21:26.732 185914 WARNING nova.compute.manager [req-a2e154dd-49ab-4823-8a62-6474b9871349 req-41f02f5e-4b3d-4324-a2de-0d126dc9e55d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state resized and task_state None.
Feb 16 13:21:27 compute-1 nova_compute[185910]: 2026-02-16 13:21:27.666 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:28 compute-1 nova_compute[185910]: 2026-02-16 13:21:28.912 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:29 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:21:29 compute-1 systemd[206835]: Activating special unit Exit the Session...
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped target Main User Target.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped target Basic System.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped target Paths.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped target Sockets.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped target Timers.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:21:29 compute-1 systemd[206835]: Closed D-Bus User Message Bus Socket.
Feb 16 13:21:29 compute-1 systemd[206835]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:21:29 compute-1 systemd[206835]: Removed slice User Application Slice.
Feb 16 13:21:29 compute-1 systemd[206835]: Reached target Shutdown.
Feb 16 13:21:29 compute-1 systemd[206835]: Finished Exit the Session.
Feb 16 13:21:29 compute-1 systemd[206835]: Reached target Exit the Session.
Feb 16 13:21:29 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:21:29 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:21:29 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:21:29 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:21:29 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:21:29 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:21:29 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:21:32 compute-1 nova_compute[185910]: 2026-02-16 13:21:32.668 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:33 compute-1 nova_compute[185910]: 2026-02-16 13:21:33.914 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:34 compute-1 podman[206992]: 2026-02-16 13:21:34.050892578 +0000 UTC m=+0.092018866 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:21:35 compute-1 podman[195236]: time="2026-02-16T13:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:21:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:21:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2626 "" "Go-http-client/1.1"
Feb 16 13:21:37 compute-1 nova_compute[185910]: 2026-02-16 13:21:37.703 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:38 compute-1 nova_compute[185910]: 2026-02-16 13:21:38.916 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:39 compute-1 ovn_controller[96285]: 2026-02-16T13:21:39Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:7c:d9 10.100.0.6
Feb 16 13:21:42 compute-1 sshd-session[207031]: Invalid user sol from 2.57.122.210 port 41774
Feb 16 13:21:42 compute-1 sshd-session[207031]: Connection closed by invalid user sol 2.57.122.210 port 41774 [preauth]
Feb 16 13:21:42 compute-1 nova_compute[185910]: 2026-02-16 13:21:42.704 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:43 compute-1 nova_compute[185910]: 2026-02-16 13:21:43.918 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:47 compute-1 nova_compute[185910]: 2026-02-16 13:21:47.706 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:48 compute-1 sshd-session[207033]: Connection closed by authenticating user root 146.190.226.24 port 53098 [preauth]
Feb 16 13:21:48 compute-1 nova_compute[185910]: 2026-02-16 13:21:48.919 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:49 compute-1 openstack_network_exporter[198096]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:21:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:21:49 compute-1 openstack_network_exporter[198096]: ERROR   13:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:21:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:21:51 compute-1 podman[207036]: 2026-02-16 13:21:51.920667437 +0000 UTC m=+0.052842309 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:21:51 compute-1 podman[207035]: 2026-02-16 13:21:51.935178891 +0000 UTC m=+0.067274950 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Feb 16 13:21:52 compute-1 nova_compute[185910]: 2026-02-16 13:21:52.746 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:53 compute-1 nova_compute[185910]: 2026-02-16 13:21:53.922 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:56 compute-1 podman[207076]: 2026-02-16 13:21:56.926977357 +0000 UTC m=+0.072281954 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:21:57 compute-1 nova_compute[185910]: 2026-02-16 13:21:57.748 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:21:58 compute-1 nova_compute[185910]: 2026-02-16 13:21:58.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:21:58 compute-1 nova_compute[185910]: 2026-02-16 13:21:58.924 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.656 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.656 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.736 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.965 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.966 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.974 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:22:00 compute-1 nova_compute[185910]: 2026-02-16 13:22:00.975 185914 INFO nova.compute.claims [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.230 185914 DEBUG nova.compute.provider_tree [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.276 185914 DEBUG nova.scheduler.client.report [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.363 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.364 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.454 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.455 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.486 185914 INFO nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.524 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.704 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.706 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.706 185914 INFO nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Creating image(s)
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.707 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.707 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.708 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.721 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.787 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.788 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.789 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.799 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.858 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.859 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.895 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.897 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.897 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.925 185914 DEBUG nova.policy [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53b5045c5aaf4a7d8d84dce2ac4aa424', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.946 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.947 185914 DEBUG nova.virt.disk.api [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking if we can resize image /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.947 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.990 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.991 185914 DEBUG nova.virt.disk.api [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Cannot resize image /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:22:01 compute-1 nova_compute[185910]: 2026-02-16 13:22:01.991 185914 DEBUG nova.objects.instance [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'migration_context' on Instance uuid 070628d7-dd99-487b-be76-d66c0d82ebc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.017 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.017 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Ensure instance console log exists: /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.018 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.018 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.018 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.643 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:02.643 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:02 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:02.645 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.665 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.666 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.666 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.666 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.749 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.758 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.810 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.812 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.859 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.864 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.911 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.912 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:02 compute-1 nova_compute[185910]: 2026-02-16 13:22:02.958 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.005 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Successfully created port: ff2861e4-b2b3-4f21-8ca5-be850cbd522e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.091 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.092 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5516MB free_disk=73.16974639892578GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.092 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.093 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:03.328 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:03.329 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:03.330 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.375 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5021a07d-59d2-49c7-b92f-0c25c5dc1222 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.375 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 934dfad2-33a3-44dd-82c8-0b913e89cb8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.375 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 070628d7-dd99-487b-be76-d66c0d82ebc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.375 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.376 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.699 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:22:03 compute-1 nova_compute[185910]: 2026-02-16 13:22:03.926 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:04 compute-1 podman[207130]: 2026-02-16 13:22:04.912927777 +0000 UTC m=+0.051193136 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:22:05 compute-1 podman[195236]: time="2026-02-16T13:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:22:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:22:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2624 "" "Go-http-client/1.1"
Feb 16 13:22:06 compute-1 nova_compute[185910]: 2026-02-16 13:22:06.780 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:22:06 compute-1 nova_compute[185910]: 2026-02-16 13:22:06.946 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:22:06 compute-1 nova_compute[185910]: 2026-02-16 13:22:06.947 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:07 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:07.647 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.751 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.858 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Successfully updated port: ff2861e4-b2b3-4f21-8ca5-be850cbd522e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.904 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.904 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquired lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.904 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.947 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.947 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.948 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:22:07 compute-1 nova_compute[185910]: 2026-02-16 13:22:07.981 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.741 185914 DEBUG nova.compute.manager [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-changed-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.742 185914 DEBUG nova.compute.manager [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Refreshing instance network info cache due to event network-changed-ff2861e4-b2b3-4f21-8ca5-be850cbd522e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.742 185914 DEBUG oslo_concurrency.lockutils [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.831 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.832 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.832 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.832 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:08 compute-1 nova_compute[185910]: 2026-02-16 13:22:08.929 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:09 compute-1 nova_compute[185910]: 2026-02-16 13:22:09.820 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:22:12 compute-1 nova_compute[185910]: 2026-02-16 13:22:12.753 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:13 compute-1 nova_compute[185910]: 2026-02-16 13:22:13.932 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.077 185914 DEBUG nova.network.neutron [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Updating instance_info_cache with network_info: [{"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.490 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.636 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.636 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.637 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.637 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.649 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Releasing lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.649 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Instance network_info: |[{"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.650 185914 DEBUG oslo_concurrency.lockutils [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.650 185914 DEBUG nova.network.neutron [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Refreshing network info cache for port ff2861e4-b2b3-4f21-8ca5-be850cbd522e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.653 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Start _get_guest_xml network_info=[{"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.657 185914 WARNING nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.668 185914 DEBUG nova.virt.libvirt.host [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.669 185914 DEBUG nova.virt.libvirt.host [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.675 185914 DEBUG nova.virt.libvirt.host [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.675 185914 DEBUG nova.virt.libvirt.host [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.677 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.677 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.678 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.678 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.678 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.678 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.679 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.679 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.679 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.679 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.680 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.680 185914 DEBUG nova.virt.hardware [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.683 185914 DEBUG nova.virt.libvirt.vif [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:21:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-337356765',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-337356765',id=4,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-5we1w2kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:22:01Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=070628d7-dd99-487b-be76-d66c0d82ebc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.684 185914 DEBUG nova.network.os_vif_util [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.685 185914 DEBUG nova.network.os_vif_util [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.687 185914 DEBUG nova.objects.instance [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'pci_devices' on Instance uuid 070628d7-dd99-487b-be76-d66c0d82ebc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.718 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <uuid>070628d7-dd99-487b-be76-d66c0d82ebc3</uuid>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <name>instance-00000004</name>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-337356765</nova:name>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:22:14</nova:creationTime>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:user uuid="53b5045c5aaf4a7d8d84dce2ac4aa424">tempest-TestExecuteActionsViaActuator-1504038973-project-member</nova:user>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:project uuid="b5e0321e3a614b62a46eef7fb2e737ff">tempest-TestExecuteActionsViaActuator-1504038973</nova:project>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         <nova:port uuid="ff2861e4-b2b3-4f21-8ca5-be850cbd522e">
Feb 16 13:22:14 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <system>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="serial">070628d7-dd99-487b-be76-d66c0d82ebc3</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="uuid">070628d7-dd99-487b-be76-d66c0d82ebc3</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </system>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <os>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </os>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <features>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </features>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.config"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:ec:b8:54"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <target dev="tapff2861e4-b2"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/console.log" append="off"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <video>
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </video>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:22:14 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:22:14 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:22:14 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:22:14 compute-1 nova_compute[185910]: </domain>
Feb 16 13:22:14 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.719 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Preparing to wait for external event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.719 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.720 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.720 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.721 185914 DEBUG nova.virt.libvirt.vif [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:21:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-337356765',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-337356765',id=4,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-5we1w2kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:22:01Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=070628d7-dd99-487b-be76-d66c0d82ebc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.721 185914 DEBUG nova.network.os_vif_util [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.722 185914 DEBUG nova.network.os_vif_util [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.722 185914 DEBUG os_vif [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.723 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.723 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.724 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.727 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.727 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff2861e4-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.728 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapff2861e4-b2, col_values=(('external_ids', {'iface-id': 'ff2861e4-b2b3-4f21-8ca5-be850cbd522e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:b8:54', 'vm-uuid': '070628d7-dd99-487b-be76-d66c0d82ebc3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:14 compute-1 NetworkManager[56388]: <info>  [1771248134.7308] manager: (tapff2861e4-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.730 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.733 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.736 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.737 185914 INFO os_vif [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2')
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.804 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.805 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.805 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] No VIF found with MAC fa:16:3e:ec:b8:54, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:22:14 compute-1 nova_compute[185910]: 2026-02-16 13:22:14.805 185914 INFO nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Using config drive
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.016 185914 INFO nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Creating config drive at /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.config
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.020 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6xbiuvm1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.145 185914 DEBUG oslo_concurrency.processutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6xbiuvm1" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:22:17 compute-1 kernel: tapff2861e4-b2: entered promiscuous mode
Feb 16 13:22:17 compute-1 NetworkManager[56388]: <info>  [1771248137.2066] manager: (tapff2861e4-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.207 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:17 compute-1 ovn_controller[96285]: 2026-02-16T13:22:17Z|00036|binding|INFO|Claiming lport ff2861e4-b2b3-4f21-8ca5-be850cbd522e for this chassis.
Feb 16 13:22:17 compute-1 ovn_controller[96285]: 2026-02-16T13:22:17Z|00037|binding|INFO|ff2861e4-b2b3-4f21-8ca5-be850cbd522e: Claiming fa:16:3e:ec:b8:54 10.100.0.14
Feb 16 13:22:17 compute-1 ovn_controller[96285]: 2026-02-16T13:22:17Z|00038|binding|INFO|Setting lport ff2861e4-b2b3-4f21-8ca5-be850cbd522e ovn-installed in OVS
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.215 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.216 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:17 compute-1 systemd-udevd[207174]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:22:17 compute-1 systemd-machined[155419]: New machine qemu-3-instance-00000004.
Feb 16 13:22:17 compute-1 NetworkManager[56388]: <info>  [1771248137.2474] device (tapff2861e4-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:22:17 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Feb 16 13:22:17 compute-1 NetworkManager[56388]: <info>  [1771248137.2486] device (tapff2861e4-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:22:17 compute-1 ovn_controller[96285]: 2026-02-16T13:22:17Z|00039|binding|INFO|Setting lport ff2861e4-b2b3-4f21-8ca5-be850cbd522e up in Southbound
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.532 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:b8:54 10.100.0.14'], port_security=['fa:16:3e:ec:b8:54 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '070628d7-dd99-487b-be76-d66c0d82ebc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '2', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=ff2861e4-b2b3-4f21-8ca5-be850cbd522e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.535 105573 INFO neutron.agent.ovn.metadata.agent [-] Port ff2861e4-b2b3-4f21-8ca5-be850cbd522e in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.537 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.550 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f83f09dc-8fba-48f8-b848-ea934cb88193]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.571 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[49ca922b-de1e-4edd-963f-1728b0b8591a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.574 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8e0ba0-347d-4da0-a284-5efcc4e7fea4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.593 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[726784cd-3076-441a-b160-4e65191d90b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.608 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc48b0d-86e7-40c0-a338-fbc63fb2b9cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 21372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207197, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.620 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[84083b91-717c-4443-9685-f325a8e62136]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207198, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207198, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.621 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.693 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.695 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.697 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.697 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:17.698 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.729 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248137.7283013, 070628d7-dd99-487b-be76-d66c0d82ebc3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.729 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] VM Started (Lifecycle Event)
Feb 16 13:22:17 compute-1 nova_compute[185910]: 2026-02-16 13:22:17.755 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.268 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.273 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248137.7289145, 070628d7-dd99-487b-be76-d66c0d82ebc3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.273 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] VM Paused (Lifecycle Event)
Feb 16 13:22:18 compute-1 sshd-session[207183]: Connection closed by authenticating user root 188.166.42.159 port 39040 [preauth]
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.313 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.317 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:22:18 compute-1 nova_compute[185910]: 2026-02-16 13:22:18.365 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:22:19 compute-1 openstack_network_exporter[198096]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:22:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:22:19 compute-1 openstack_network_exporter[198096]: ERROR   13:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:22:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.523 185914 DEBUG nova.compute.manager [req-d80e073a-b7f4-4a2d-b931-852b0475a427 req-2aa5ff68-d7f9-45a8-b362-5eceab7a79c6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.524 185914 DEBUG oslo_concurrency.lockutils [req-d80e073a-b7f4-4a2d-b931-852b0475a427 req-2aa5ff68-d7f9-45a8-b362-5eceab7a79c6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.524 185914 DEBUG oslo_concurrency.lockutils [req-d80e073a-b7f4-4a2d-b931-852b0475a427 req-2aa5ff68-d7f9-45a8-b362-5eceab7a79c6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.525 185914 DEBUG oslo_concurrency.lockutils [req-d80e073a-b7f4-4a2d-b931-852b0475a427 req-2aa5ff68-d7f9-45a8-b362-5eceab7a79c6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.525 185914 DEBUG nova.compute.manager [req-d80e073a-b7f4-4a2d-b931-852b0475a427 req-2aa5ff68-d7f9-45a8-b362-5eceab7a79c6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Processing event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.525 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.534 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.535 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248139.533272, 070628d7-dd99-487b-be76-d66c0d82ebc3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.535 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] VM Resumed (Lifecycle Event)
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.539 185914 INFO nova.virt.libvirt.driver [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Instance spawned successfully.
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.540 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.609 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.614 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.615 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.615 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.616 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.616 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.617 185914 DEBUG nova.virt.libvirt.driver [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.621 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.730 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:19 compute-1 nova_compute[185910]: 2026-02-16 13:22:19.794 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:22:20 compute-1 nova_compute[185910]: 2026-02-16 13:22:20.167 185914 INFO nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Took 18.46 seconds to spawn the instance on the hypervisor.
Feb 16 13:22:20 compute-1 nova_compute[185910]: 2026-02-16 13:22:20.168 185914 DEBUG nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:22:20 compute-1 nova_compute[185910]: 2026-02-16 13:22:20.632 185914 INFO nova.compute.manager [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Took 19.72 seconds to build instance.
Feb 16 13:22:20 compute-1 nova_compute[185910]: 2026-02-16 13:22:20.660 185914 DEBUG oslo_concurrency.lockutils [None req-408b9a39-daae-4243-ac51-647c83046f90 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.067 185914 DEBUG nova.network.neutron [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Updated VIF entry in instance network info cache for port ff2861e4-b2b3-4f21-8ca5-be850cbd522e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.069 185914 DEBUG nova.network.neutron [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Updating instance_info_cache with network_info: [{"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.103 185914 DEBUG oslo_concurrency.lockutils [req-23414210-94e4-461b-9b3d-f342aa3be6ff req-98712026-8353-4d1c-aebe-12dceb7754ab faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-070628d7-dd99-487b-be76-d66c0d82ebc3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.707 185914 DEBUG nova.compute.manager [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.708 185914 DEBUG oslo_concurrency.lockutils [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.708 185914 DEBUG oslo_concurrency.lockutils [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.708 185914 DEBUG oslo_concurrency.lockutils [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.708 185914 DEBUG nova.compute.manager [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] No waiting events found dispatching network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:22:21 compute-1 nova_compute[185910]: 2026-02-16 13:22:21.709 185914 WARNING nova.compute.manager [req-ba88fb74-a625-474f-966c-a4f819cdd383 req-f81b9177-e2ff-4c64-a2f2-ff254cec034a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received unexpected event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e for instance with vm_state active and task_state None.
Feb 16 13:22:22 compute-1 nova_compute[185910]: 2026-02-16 13:22:22.757 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:22 compute-1 podman[207203]: 2026-02-16 13:22:22.92788468 +0000 UTC m=+0.066329417 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:22:22 compute-1 podman[207202]: 2026-02-16 13:22:22.953842717 +0000 UTC m=+0.085223507 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:22:24 compute-1 nova_compute[185910]: 2026-02-16 13:22:24.734 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:27 compute-1 nova_compute[185910]: 2026-02-16 13:22:27.758 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:27 compute-1 podman[207242]: 2026-02-16 13:22:27.974928538 +0000 UTC m=+0.108418561 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:22:29 compute-1 nova_compute[185910]: 2026-02-16 13:22:29.737 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:32 compute-1 nova_compute[185910]: 2026-02-16 13:22:32.759 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:33 compute-1 ovn_controller[96285]: 2026-02-16T13:22:33Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:b8:54 10.100.0.14
Feb 16 13:22:33 compute-1 ovn_controller[96285]: 2026-02-16T13:22:33Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:b8:54 10.100.0.14
Feb 16 13:22:34 compute-1 nova_compute[185910]: 2026-02-16 13:22:34.739 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:35 compute-1 podman[195236]: time="2026-02-16T13:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:22:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:22:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2629 "" "Go-http-client/1.1"
Feb 16 13:22:36 compute-1 podman[207286]: 2026-02-16 13:22:36.000930118 +0000 UTC m=+0.123339786 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:22:37 compute-1 nova_compute[185910]: 2026-02-16 13:22:37.762 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:39 compute-1 nova_compute[185910]: 2026-02-16 13:22:39.742 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:42 compute-1 nova_compute[185910]: 2026-02-16 13:22:42.763 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:43.712 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:22:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:43.713 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:22:43 compute-1 nova_compute[185910]: 2026-02-16 13:22:43.731 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:44 compute-1 nova_compute[185910]: 2026-02-16 13:22:44.744 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:47 compute-1 nova_compute[185910]: 2026-02-16 13:22:47.766 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:49 compute-1 openstack_network_exporter[198096]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:22:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:22:49 compute-1 openstack_network_exporter[198096]: ERROR   13:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:22:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:22:49 compute-1 nova_compute[185910]: 2026-02-16 13:22:49.747 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:52 compute-1 nova_compute[185910]: 2026-02-16 13:22:52.767 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:53 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:22:53.716 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:22:53 compute-1 podman[207323]: 2026-02-16 13:22:53.923069261 +0000 UTC m=+0.058396567 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-type=git, name=ubi9/ubi-minimal, vendor=Red Hat, Inc.)
Feb 16 13:22:53 compute-1 podman[207324]: 2026-02-16 13:22:53.95402682 +0000 UTC m=+0.087104946 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:22:54 compute-1 nova_compute[185910]: 2026-02-16 13:22:54.750 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:56 compute-1 nova_compute[185910]: 2026-02-16 13:22:56.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:56 compute-1 nova_compute[185910]: 2026-02-16 13:22:56.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:22:56 compute-1 sshd-session[207361]: Connection closed by authenticating user root 146.190.226.24 port 35322 [preauth]
Feb 16 13:22:57 compute-1 nova_compute[185910]: 2026-02-16 13:22:57.769 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:22:58 compute-1 nova_compute[185910]: 2026-02-16 13:22:58.674 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:22:58 compute-1 podman[207363]: 2026-02-16 13:22:58.968726274 +0000 UTC m=+0.104127118 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 16 13:22:59 compute-1 nova_compute[185910]: 2026-02-16 13:22:59.753 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:00 compute-1 nova_compute[185910]: 2026-02-16 13:23:00.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:01 compute-1 nova_compute[185910]: 2026-02-16 13:23:01.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:02 compute-1 nova_compute[185910]: 2026-02-16 13:23:02.771 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:03.330 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:03.330 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:03.331 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.660 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:23:03 compute-1 nova_compute[185910]: 2026-02-16 13:23:03.661 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:04 compute-1 nova_compute[185910]: 2026-02-16 13:23:04.718 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:04 compute-1 nova_compute[185910]: 2026-02-16 13:23:04.718 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:23:04 compute-1 nova_compute[185910]: 2026-02-16 13:23:04.755 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:05 compute-1 podman[195236]: time="2026-02-16T13:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:23:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:23:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2631 "" "Go-http-client/1.1"
Feb 16 13:23:06 compute-1 nova_compute[185910]: 2026-02-16 13:23:06.263 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:06 compute-1 nova_compute[185910]: 2026-02-16 13:23:06.264 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:06 compute-1 nova_compute[185910]: 2026-02-16 13:23:06.264 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:23:06 compute-1 podman[207390]: 2026-02-16 13:23:06.905024128 +0000 UTC m=+0.050775835 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:23:07 compute-1 nova_compute[185910]: 2026-02-16 13:23:07.800 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:09 compute-1 nova_compute[185910]: 2026-02-16 13:23:09.758 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.268 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updating instance_info_cache with network_info: [{"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.329 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-5021a07d-59d2-49c7-b92f-0c25c5dc1222" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.330 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.330 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.330 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.331 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.387 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.388 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.388 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.388 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.584 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.645 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.647 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.720 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.726 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.772 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.773 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.827 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.832 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.876 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.877 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:11 compute-1 nova_compute[185910]: 2026-02-16 13:23:11.920 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.057 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.058 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5364MB free_disk=73.14106750488281GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.059 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.059 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.279 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5021a07d-59d2-49c7-b92f-0c25c5dc1222 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.279 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 934dfad2-33a3-44dd-82c8-0b913e89cb8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.280 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 070628d7-dd99-487b-be76-d66c0d82ebc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.280 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.280 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.622 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.643 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.702 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.703 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:12 compute-1 sshd-session[207418]: Invalid user postgres from 188.166.42.159 port 35746
Feb 16 13:23:12 compute-1 nova_compute[185910]: 2026-02-16 13:23:12.803 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:13 compute-1 sshd-session[207418]: Connection closed by invalid user postgres 188.166.42.159 port 35746 [preauth]
Feb 16 13:23:13 compute-1 nova_compute[185910]: 2026-02-16 13:23:13.611 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:14 compute-1 nova_compute[185910]: 2026-02-16 13:23:14.799 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:17 compute-1 nova_compute[185910]: 2026-02-16 13:23:17.805 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:19 compute-1 openstack_network_exporter[198096]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:23:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:23:19 compute-1 openstack_network_exporter[198096]: ERROR   13:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:23:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:23:19 compute-1 nova_compute[185910]: 2026-02-16 13:23:19.802 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:22 compute-1 nova_compute[185910]: 2026-02-16 13:23:22.808 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:24 compute-1 nova_compute[185910]: 2026-02-16 13:23:24.804 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:24 compute-1 podman[207437]: 2026-02-16 13:23:24.917466296 +0000 UTC m=+0.055609962 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:23:24 compute-1 podman[207436]: 2026-02-16 13:23:24.917790924 +0000 UTC m=+0.056025763 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Feb 16 13:23:27 compute-1 nova_compute[185910]: 2026-02-16 13:23:27.810 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:29 compute-1 nova_compute[185910]: 2026-02-16 13:23:29.807 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:29 compute-1 podman[207476]: 2026-02-16 13:23:29.939122998 +0000 UTC m=+0.079380566 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:23:32 compute-1 nova_compute[185910]: 2026-02-16 13:23:32.812 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:34 compute-1 nova_compute[185910]: 2026-02-16 13:23:34.810 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:35 compute-1 podman[195236]: time="2026-02-16T13:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:23:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:23:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2628 "" "Go-http-client/1.1"
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.896 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.952 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Triggering sync for uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.953 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Triggering sync for uuid 5021a07d-59d2-49c7-b92f-0c25c5dc1222 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.953 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Triggering sync for uuid 070628d7-dd99-487b-be76-d66c0d82ebc3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.954 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.954 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.954 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.955 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.955 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.955 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:36 compute-1 nova_compute[185910]: 2026-02-16 13:23:36.961 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating tmpfile /var/lib/nova/instances/tmp2w75m052 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:23:37 compute-1 nova_compute[185910]: 2026-02-16 13:23:37.022 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:37 compute-1 nova_compute[185910]: 2026-02-16 13:23:37.031 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:37 compute-1 nova_compute[185910]: 2026-02-16 13:23:37.041 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:37 compute-1 nova_compute[185910]: 2026-02-16 13:23:37.200 185914 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:23:37 compute-1 nova_compute[185910]: 2026-02-16 13:23:37.865 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:37 compute-1 podman[207517]: 2026-02-16 13:23:37.932107096 +0000 UTC m=+0.046825918 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:23:38 compute-1 ovn_controller[96285]: 2026-02-16T13:23:38Z|00040|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Feb 16 13:23:39 compute-1 nova_compute[185910]: 2026-02-16 13:23:39.042 185914 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:23:39 compute-1 nova_compute[185910]: 2026-02-16 13:23:39.094 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:39 compute-1 nova_compute[185910]: 2026-02-16 13:23:39.094 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:39 compute-1 nova_compute[185910]: 2026-02-16 13:23:39.094 185914 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:23:39 compute-1 nova_compute[185910]: 2026-02-16 13:23:39.813 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:42 compute-1 nova_compute[185910]: 2026-02-16 13:23:42.868 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.207 185914 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.233 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.237 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.238 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating instance directory: /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.239 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Creating disk.info with the contents: {'/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk': 'qcow2', '/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.240 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.241 185914 DEBUG nova.objects.instance [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.291 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.342 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.344 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.344 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.359 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.409 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.410 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.445 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.446 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.447 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.498 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.499 185914 DEBUG nova.virt.disk.api [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.500 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.551 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.553 185914 DEBUG nova.virt.disk.api [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.554 185914 DEBUG nova.objects.instance [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.575 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.595 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.596 185914 DEBUG nova.virt.libvirt.volume.remotefs [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config to /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.597 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:23:44 compute-1 nova_compute[185910]: 2026-02-16 13:23:44.815 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.015 185914 DEBUG oslo_concurrency.processutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk.config /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.016 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.018 185914 DEBUG nova.virt.libvirt.vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:21:47Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.019 185914 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.020 185914 DEBUG nova.network.os_vif_util [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.021 185914 DEBUG os_vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.022 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.022 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.023 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.026 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.027 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bdc1813-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.027 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bdc1813-a8, col_values=(('external_ids', {'iface-id': '3bdc1813-a8d3-43b8-805c-95acd138d9d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:da:08', 'vm-uuid': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.029 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:45 compute-1 NetworkManager[56388]: <info>  [1771248225.0304] manager: (tap3bdc1813-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.032 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.036 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.038 185914 INFO os_vif [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8')
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.039 185914 DEBUG nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:23:45 compute-1 nova_compute[185910]: 2026-02-16 13:23:45.039 185914 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:23:47 compute-1 nova_compute[185910]: 2026-02-16 13:23:47.870 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:48 compute-1 nova_compute[185910]: 2026-02-16 13:23:48.828 185914 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:23:48 compute-1 nova_compute[185910]: 2026-02-16 13:23:48.830 185914 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=71680,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp2w75m052',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b21f8b55-68d7-4cd7-beed-2d61f932f84e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:23:49 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:23:49 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:23:49 compute-1 kernel: tap3bdc1813-a8: entered promiscuous mode
Feb 16 13:23:49 compute-1 NetworkManager[56388]: <info>  [1771248229.2144] manager: (tap3bdc1813-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Feb 16 13:23:49 compute-1 ovn_controller[96285]: 2026-02-16T13:23:49Z|00041|binding|INFO|Claiming lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 for this additional chassis.
Feb 16 13:23:49 compute-1 ovn_controller[96285]: 2026-02-16T13:23:49Z|00042|binding|INFO|3bdc1813-a8d3-43b8-805c-95acd138d9d6: Claiming fa:16:3e:8a:da:08 10.100.0.4
Feb 16 13:23:49 compute-1 nova_compute[185910]: 2026-02-16 13:23:49.216 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:49 compute-1 ovn_controller[96285]: 2026-02-16T13:23:49Z|00043|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 ovn-installed in OVS
Feb 16 13:23:49 compute-1 nova_compute[185910]: 2026-02-16 13:23:49.223 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:49 compute-1 nova_compute[185910]: 2026-02-16 13:23:49.224 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:49 compute-1 nova_compute[185910]: 2026-02-16 13:23:49.229 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:49 compute-1 systemd-machined[155419]: New machine qemu-4-instance-00000003.
Feb 16 13:23:49 compute-1 systemd-udevd[207599]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:23:49 compute-1 NetworkManager[56388]: <info>  [1771248229.2616] device (tap3bdc1813-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:23:49 compute-1 NetworkManager[56388]: <info>  [1771248229.2629] device (tap3bdc1813-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:23:49 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Feb 16 13:23:49 compute-1 openstack_network_exporter[198096]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:23:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:23:49 compute-1 openstack_network_exporter[198096]: ERROR   13:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:23:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:23:50 compute-1 nova_compute[185910]: 2026-02-16 13:23:50.030 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:50 compute-1 nova_compute[185910]: 2026-02-16 13:23:50.060 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248230.0591927, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:50 compute-1 nova_compute[185910]: 2026-02-16 13:23:50.060 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Started (Lifecycle Event)
Feb 16 13:23:50 compute-1 nova_compute[185910]: 2026-02-16 13:23:50.108 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.259 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248232.2594209, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.260 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Resumed (Lifecycle Event)
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.291 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.296 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.370 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Feb 16 13:23:52 compute-1 nova_compute[185910]: 2026-02-16 13:23:52.872 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:55 compute-1 nova_compute[185910]: 2026-02-16 13:23:55.033 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.690 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:23:55 compute-1 nova_compute[185910]: 2026-02-16 13:23:55.690 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.693 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:23:55 compute-1 ovn_controller[96285]: 2026-02-16T13:23:55Z|00044|binding|INFO|Claiming lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 for this chassis.
Feb 16 13:23:55 compute-1 ovn_controller[96285]: 2026-02-16T13:23:55Z|00045|binding|INFO|3bdc1813-a8d3-43b8-805c-95acd138d9d6: Claiming fa:16:3e:8a:da:08 10.100.0.4
Feb 16 13:23:55 compute-1 ovn_controller[96285]: 2026-02-16T13:23:55Z|00046|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 up in Southbound
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.803 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:da:08 10.100.0.4'], port_security=['fa:16:3e:8a:da:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '11', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=3bdc1813-a8d3-43b8-805c-95acd138d9d6) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.804 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 in datapath a6199784-1742-41a7-9152-bb54abb7bef1 bound to our chassis
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.806 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.821 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d49a1a86-ae70-48aa-9b7c-a0a79630fdbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.850 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[926b892f-69f2-4c3f-9ff9-2e4af2aae6f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.854 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[6d73fc65-41e4-4b92-a32e-4b7060974a2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.876 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[c37897ba-190f-46c9-9060-8f5492f5e663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.890 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c984652c-e61c-479d-afd5-6c9ba94cf389]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 17, 'tx_packets': 9, 'rx_bytes': 994, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 21372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207647, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.906 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bac0db-e78d-4259-a2e0-880b7a1e439d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207654, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207654, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.908 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:55 compute-1 nova_compute[185910]: 2026-02-16 13:23:55.910 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.911 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.911 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.912 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:23:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:23:55.912 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:23:56 compute-1 podman[207634]: 2026-02-16 13:23:56.138236413 +0000 UTC m=+0.265375509 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:23:56 compute-1 podman[207633]: 2026-02-16 13:23:56.154144846 +0000 UTC m=+0.278785156 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:23:56 compute-1 nova_compute[185910]: 2026-02-16 13:23:56.626 185914 INFO nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Post operation of migration started
Feb 16 13:23:57 compute-1 nova_compute[185910]: 2026-02-16 13:23:57.490 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:23:57 compute-1 nova_compute[185910]: 2026-02-16 13:23:57.491 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:23:57 compute-1 nova_compute[185910]: 2026-02-16 13:23:57.491 185914 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:23:57 compute-1 nova_compute[185910]: 2026-02-16 13:23:57.875 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:23:58 compute-1 nova_compute[185910]: 2026-02-16 13:23:58.690 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:00 compute-1 nova_compute[185910]: 2026-02-16 13:24:00.035 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:00 compute-1 podman[207680]: 2026-02-16 13:24:00.145999022 +0000 UTC m=+0.079888488 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:24:00 compute-1 nova_compute[185910]: 2026-02-16 13:24:00.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.612 185914 DEBUG nova.network.neutron [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [{"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.641 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b21f8b55-68d7-4cd7-beed-2d61f932f84e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.696 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.696 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.697 185914 DEBUG oslo_concurrency.lockutils [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.702 185914 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:24:02 compute-1 virtqemud[185025]: Domain id=4 name='instance-00000003' uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e is tainted: custom-monitor
Feb 16 13:24:02 compute-1 nova_compute[185910]: 2026-02-16 13:24:02.877 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:03.330 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:03.331 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:03.332 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:03 compute-1 nova_compute[185910]: 2026-02-16 13:24:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:03 compute-1 nova_compute[185910]: 2026-02-16 13:24:03.709 185914 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:24:04 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:04.696 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:04 compute-1 nova_compute[185910]: 2026-02-16 13:24:04.714 185914 INFO nova.virt.libvirt.driver [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:24:04 compute-1 nova_compute[185910]: 2026-02-16 13:24:04.718 185914 DEBUG nova.compute.manager [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:04 compute-1 nova_compute[185910]: 2026-02-16 13:24:04.745 185914 DEBUG nova.objects.instance [None req-307919dd-13fc-4b5d-9f47-6d3683aea8c2 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.038 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:05 compute-1 podman[195236]: time="2026-02-16T13:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:24:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17243 "" "Go-http-client/1.1"
Feb 16 13:24:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.694 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.694 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.694 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.695 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:24:05 compute-1 sshd-session[207707]: Connection closed by authenticating user root 146.190.226.24 port 34514 [preauth]
Feb 16 13:24:05 compute-1 nova_compute[185910]: 2026-02-16 13:24:05.869 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.004 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.006 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.064 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.070 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.127 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.128 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.177 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.183 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.233 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.235 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.295 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.303 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.349 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.350 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.407 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.548 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.550 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5216MB free_disk=73.11233520507812GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.550 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.551 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.799 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Applying migration context for instance b21f8b55-68d7-4cd7-beed-2d61f932f84e as it has an incoming, in-progress migration 735afe49-654b-4958-aac0-48e243770fe0. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.799 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:24:06 compute-1 nova_compute[185910]: 2026-02-16 13:24:06.954 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.140 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 070628d7-dd99-487b-be76-d66c0d82ebc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5021a07d-59d2-49c7-b92f-0c25c5dc1222 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 934dfad2-33a3-44dd-82c8-0b913e89cb8e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance b21f8b55-68d7-4cd7-beed-2d61f932f84e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.395 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.423 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.458 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.458 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.908s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:07 compute-1 nova_compute[185910]: 2026-02-16 13:24:07.918 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:08 compute-1 sshd-session[207734]: Invalid user oracle from 188.166.42.159 port 60814
Feb 16 13:24:08 compute-1 nova_compute[185910]: 2026-02-16 13:24:08.459 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:08 compute-1 nova_compute[185910]: 2026-02-16 13:24:08.460 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:24:08 compute-1 nova_compute[185910]: 2026-02-16 13:24:08.460 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:24:08 compute-1 podman[207736]: 2026-02-16 13:24:08.521857676 +0000 UTC m=+0.093663035 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:24:08 compute-1 sshd-session[207734]: Connection closed by invalid user oracle 188.166.42.159 port 60814 [preauth]
Feb 16 13:24:09 compute-1 nova_compute[185910]: 2026-02-16 13:24:09.102 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:24:09 compute-1 nova_compute[185910]: 2026-02-16 13:24:09.102 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:24:09 compute-1 nova_compute[185910]: 2026-02-16 13:24:09.102 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:24:09 compute-1 nova_compute[185910]: 2026-02-16 13:24:09.102 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:09 compute-1 sshd-session[207747]: Invalid user sol from 2.57.122.210 port 44472
Feb 16 13:24:09 compute-1 sshd-session[207747]: Connection closed by invalid user sol 2.57.122.210 port 44472 [preauth]
Feb 16 13:24:10 compute-1 nova_compute[185910]: 2026-02-16 13:24:10.039 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.545 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [{"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.572 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-934dfad2-33a3-44dd-82c8-0b913e89cb8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.573 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.573 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.573 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:24:12 compute-1 nova_compute[185910]: 2026-02-16 13:24:12.920 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:15 compute-1 nova_compute[185910]: 2026-02-16 13:24:15.042 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:17 compute-1 nova_compute[185910]: 2026-02-16 13:24:17.922 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:19 compute-1 openstack_network_exporter[198096]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:24:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:24:19 compute-1 openstack_network_exporter[198096]: ERROR   13:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:24:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.044 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.044 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.045 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.045 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.045 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.046 185914 INFO nova.compute.manager [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Terminating instance
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.047 185914 DEBUG nova.compute.manager [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.048 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 kernel: tapff2861e4-b2 (unregistering): left promiscuous mode
Feb 16 13:24:20 compute-1 NetworkManager[56388]: <info>  [1771248260.0791] device (tapff2861e4-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:20 compute-1 ovn_controller[96285]: 2026-02-16T13:24:20Z|00047|binding|INFO|Releasing lport ff2861e4-b2b3-4f21-8ca5-be850cbd522e from this chassis (sb_readonly=0)
Feb 16 13:24:20 compute-1 ovn_controller[96285]: 2026-02-16T13:24:20Z|00048|binding|INFO|Setting lport ff2861e4-b2b3-4f21-8ca5-be850cbd522e down in Southbound
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.086 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 ovn_controller[96285]: 2026-02-16T13:24:20Z|00049|binding|INFO|Removing iface tapff2861e4-b2 ovn-installed in OVS
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.091 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.112 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:b8:54 10.100.0.14'], port_security=['fa:16:3e:ec:b8:54 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '070628d7-dd99-487b-be76-d66c0d82ebc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=ff2861e4-b2b3-4f21-8ca5-be850cbd522e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.114 105573 INFO neutron.agent.ovn.metadata.agent [-] Port ff2861e4-b2b3-4f21-8ca5-be850cbd522e in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.116 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.131 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1d3673-9659-4947-a0d5-480a03ffe772]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 16 13:24:20 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 17.590s CPU time.
Feb 16 13:24:20 compute-1 systemd-machined[155419]: Machine qemu-3-instance-00000004 terminated.
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.159 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5e456d43-f12e-4d96-8923-9502795714f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.162 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d130ae5b-f561-450f-9f9c-9241629d1231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.180 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[a98ca0e7-d81b-40d2-9fd4-bca5439aaf81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.194 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a81e6736-ff91-4cd9-817b-5cc323b62d53]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 11, 'rx_bytes': 1624, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 42363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207776, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.206 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b95811f2-f868-41e7-ba16-2073bc28ac43]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207777, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207777, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.208 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.210 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.215 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.215 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.215 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.215 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:20 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:20.216 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.269 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.273 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.307 185914 INFO nova.virt.libvirt.driver [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Instance destroyed successfully.
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.308 185914 DEBUG nova.objects.instance [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid 070628d7-dd99-487b-be76-d66c0d82ebc3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.345 185914 DEBUG nova.virt.libvirt.vif [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:21:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-337356765',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-337356765',id=4,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:22:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-5we1w2kg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:22:20Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=070628d7-dd99-487b-be76-d66c0d82ebc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.346 185914 DEBUG nova.network.os_vif_util [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "address": "fa:16:3e:ec:b8:54", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapff2861e4-b2", "ovs_interfaceid": "ff2861e4-b2b3-4f21-8ca5-be850cbd522e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.347 185914 DEBUG nova.network.os_vif_util [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.347 185914 DEBUG os_vif [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.349 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.349 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff2861e4-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.375 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.378 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.383 185914 INFO os_vif [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:b8:54,bridge_name='br-int',has_traffic_filtering=True,id=ff2861e4-b2b3-4f21-8ca5-be850cbd522e,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapff2861e4-b2')
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.384 185914 INFO nova.virt.libvirt.driver [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Deleting instance files /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3_del
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.384 185914 INFO nova.virt.libvirt.driver [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Deletion of /var/lib/nova/instances/070628d7-dd99-487b-be76-d66c0d82ebc3_del complete
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.496 185914 DEBUG nova.virt.libvirt.host [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.497 185914 INFO nova.virt.libvirt.host [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] UEFI support detected
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.498 185914 INFO nova.compute.manager [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Took 0.45 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.499 185914 DEBUG oslo.service.loopingcall [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.499 185914 DEBUG nova.compute.manager [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:20 compute-1 nova_compute[185910]: 2026-02-16 13:24:20.499 185914 DEBUG nova.network.neutron [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.761 185914 DEBUG nova.compute.manager [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-unplugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.762 185914 DEBUG oslo_concurrency.lockutils [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.762 185914 DEBUG oslo_concurrency.lockutils [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.762 185914 DEBUG oslo_concurrency.lockutils [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.763 185914 DEBUG nova.compute.manager [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] No waiting events found dispatching network-vif-unplugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:21 compute-1 nova_compute[185910]: 2026-02-16 13:24:21.763 185914 DEBUG nova.compute.manager [req-4023cb11-77e2-4dd4-bca6-8c8879a070b1 req-492fd295-b888-4304-80db-02258f1cd745 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-unplugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:22 compute-1 nova_compute[185910]: 2026-02-16 13:24:22.924 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.401 185914 DEBUG nova.network.neutron [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.440 185914 INFO nova.compute.manager [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Took 2.94 seconds to deallocate network for instance.
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.524 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.525 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.800 185914 DEBUG nova.compute.provider_tree [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.826 185914 DEBUG nova.scheduler.client.report [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:23 compute-1 nova_compute[185910]: 2026-02-16 13:24:23.943 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.035 185914 INFO nova.scheduler.client.report [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance 070628d7-dd99-487b-be76-d66c0d82ebc3
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.120 185914 DEBUG nova.compute.manager [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.121 185914 DEBUG oslo_concurrency.lockutils [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.121 185914 DEBUG oslo_concurrency.lockutils [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.121 185914 DEBUG oslo_concurrency.lockutils [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.122 185914 DEBUG nova.compute.manager [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] No waiting events found dispatching network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.122 185914 WARNING nova.compute.manager [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received unexpected event network-vif-plugged-ff2861e4-b2b3-4f21-8ca5-be850cbd522e for instance with vm_state deleted and task_state None.
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.122 185914 DEBUG nova.compute.manager [req-096766a4-7580-4154-8a68-48564350ad32 req-09b84534-5bfd-4112-b512-a70f354b2099 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Received event network-vif-deleted-ff2861e4-b2b3-4f21-8ca5-be850cbd522e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:24 compute-1 nova_compute[185910]: 2026-02-16 13:24:24.248 185914 DEBUG oslo_concurrency.lockutils [None req-cd7a1ba9-7646-4093-b786-7d30b8cd774e 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "070628d7-dd99-487b-be76-d66c0d82ebc3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.375 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.865 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.866 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.866 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.866 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.866 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.868 185914 INFO nova.compute.manager [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Terminating instance
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.868 185914 DEBUG nova.compute.manager [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:25 compute-1 kernel: tap3bdc1813-a8 (unregistering): left promiscuous mode
Feb 16 13:24:25 compute-1 NetworkManager[56388]: <info>  [1771248265.9015] device (tap3bdc1813-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:25 compute-1 ovn_controller[96285]: 2026-02-16T13:24:25Z|00050|binding|INFO|Releasing lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 from this chassis (sb_readonly=0)
Feb 16 13:24:25 compute-1 ovn_controller[96285]: 2026-02-16T13:24:25Z|00051|binding|INFO|Setting lport 3bdc1813-a8d3-43b8-805c-95acd138d9d6 down in Southbound
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.903 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:25 compute-1 ovn_controller[96285]: 2026-02-16T13:24:25Z|00052|binding|INFO|Removing iface tap3bdc1813-a8 ovn-installed in OVS
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.906 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:25 compute-1 nova_compute[185910]: 2026-02-16 13:24:25.912 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.921 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:da:08 10.100.0.4'], port_security=['fa:16:3e:8a:da:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b21f8b55-68d7-4cd7-beed-2d61f932f84e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '14', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=3bdc1813-a8d3-43b8-805c-95acd138d9d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.922 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 3bdc1813-a8d3-43b8-805c-95acd138d9d6 in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.923 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.935 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[04293601-89dc-4e23-9b77-1f005888de71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:25 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 16 13:24:25 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 3.852s CPU time.
Feb 16 13:24:25 compute-1 systemd-machined[155419]: Machine qemu-4-instance-00000003 terminated.
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.956 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d25304dd-0016-4947-abdf-fc15f48f8a14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.960 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[10bdecc9-4ffe-4945-9982-9b0fcfda8ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.979 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb06f03-2341-4495-ad75-21c1f6477979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:25 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:25.994 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1b36f0a2-36c0-4432-b619-28705213db4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 13, 'rx_bytes': 1624, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 42363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207808, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.010 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5d24e737-c50d-4804-8496-bac1adf186db]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207809, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207809, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.012 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.013 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.016 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.017 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.017 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.017 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:26.017 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.122 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.147 185914 INFO nova.virt.libvirt.driver [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Instance destroyed successfully.
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.148 185914 DEBUG nova.objects.instance [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid b21f8b55-68d7-4cd7-beed-2d61f932f84e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.178 185914 DEBUG nova.virt.libvirt.vif [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:21:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-2049385443',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-2049385443',id=3,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-h5dq1f8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:24:04Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=b21f8b55-68d7-4cd7-beed-2d61f932f84e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.178 185914 DEBUG nova.network.os_vif_util [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "address": "fa:16:3e:8a:da:08", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bdc1813-a8", "ovs_interfaceid": "3bdc1813-a8d3-43b8-805c-95acd138d9d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.179 185914 DEBUG nova.network.os_vif_util [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.180 185914 DEBUG os_vif [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.182 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.182 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bdc1813-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.183 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.185 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.187 185914 INFO os_vif [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:da:08,bridge_name='br-int',has_traffic_filtering=True,id=3bdc1813-a8d3-43b8-805c-95acd138d9d6,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bdc1813-a8')
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.188 185914 INFO nova.virt.libvirt.driver [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Deleting instance files /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e_del
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.188 185914 INFO nova.virt.libvirt.driver [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Deletion of /var/lib/nova/instances/b21f8b55-68d7-4cd7-beed-2d61f932f84e_del complete
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.272 185914 INFO nova.compute.manager [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Took 0.40 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.272 185914 DEBUG oslo.service.loopingcall [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.273 185914 DEBUG nova.compute.manager [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.273 185914 DEBUG nova.network.neutron [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.635 185914 DEBUG nova.compute.manager [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.635 185914 DEBUG oslo_concurrency.lockutils [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.636 185914 DEBUG oslo_concurrency.lockutils [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.636 185914 DEBUG oslo_concurrency.lockutils [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.636 185914 DEBUG nova.compute.manager [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:26 compute-1 nova_compute[185910]: 2026-02-16 13:24:26.636 185914 DEBUG nova.compute.manager [req-30ba91b6-4da2-48c1-a383-71619f803863 req-0324dd9d-1447-4023-a0c8-b623a5c2f145 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-unplugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:26 compute-1 podman[207828]: 2026-02-16 13:24:26.925652106 +0000 UTC m=+0.049648673 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:24:26 compute-1 podman[207827]: 2026-02-16 13:24:26.936834614 +0000 UTC m=+0.063694868 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, name=ubi9/ubi-minimal)
Feb 16 13:24:27 compute-1 nova_compute[185910]: 2026-02-16 13:24:27.910 185914 DEBUG nova.network.neutron [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:27 compute-1 nova_compute[185910]: 2026-02-16 13:24:27.926 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:27 compute-1 nova_compute[185910]: 2026-02-16 13:24:27.937 185914 INFO nova.compute.manager [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Took 1.66 seconds to deallocate network for instance.
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.005 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.006 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.017 185914 DEBUG nova.compute.manager [req-24db713b-0bca-4e73-9c68-65676cc17b08 req-2d757f35-2dbb-4023-9374-f030451c990a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-deleted-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.121 185914 DEBUG nova.compute.provider_tree [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.141 185914 DEBUG nova.scheduler.client.report [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.174 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.202 185914 INFO nova.scheduler.client.report [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance b21f8b55-68d7-4cd7-beed-2d61f932f84e
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.279 185914 DEBUG oslo_concurrency.lockutils [None req-893b346c-e922-4810-941d-27d892ec143c 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.752 185914 DEBUG nova.compute.manager [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.753 185914 DEBUG oslo_concurrency.lockutils [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.753 185914 DEBUG oslo_concurrency.lockutils [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.753 185914 DEBUG oslo_concurrency.lockutils [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b21f8b55-68d7-4cd7-beed-2d61f932f84e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.753 185914 DEBUG nova.compute.manager [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] No waiting events found dispatching network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:28 compute-1 nova_compute[185910]: 2026-02-16 13:24:28.753 185914 WARNING nova.compute.manager [req-3b5aac96-cf9e-4bcf-8889-21cdb2fab1fa req-e6f70d9f-07f4-4eda-8e6d-efd8ae894acc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Received unexpected event network-vif-plugged-3bdc1813-a8d3-43b8-805c-95acd138d9d6 for instance with vm_state deleted and task_state None.
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.543 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.544 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.544 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.544 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.545 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.546 185914 INFO nova.compute.manager [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Terminating instance
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.547 185914 DEBUG nova.compute.manager [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:29 compute-1 kernel: tapec35d953-ee (unregistering): left promiscuous mode
Feb 16 13:24:29 compute-1 NetworkManager[56388]: <info>  [1771248269.5778] device (tapec35d953-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:29 compute-1 ovn_controller[96285]: 2026-02-16T13:24:29Z|00053|binding|INFO|Releasing lport ec35d953-ee21-47b6-bef7-1618058f79be from this chassis (sb_readonly=0)
Feb 16 13:24:29 compute-1 ovn_controller[96285]: 2026-02-16T13:24:29Z|00054|binding|INFO|Setting lport ec35d953-ee21-47b6-bef7-1618058f79be down in Southbound
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.581 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 ovn_controller[96285]: 2026-02-16T13:24:29Z|00055|binding|INFO|Removing iface tapec35d953-ee ovn-installed in OVS
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.588 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.594 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:da:c0 10.100.0.10'], port_security=['fa:16:3e:be:da:c0 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '5021a07d-59d2-49c7-b92f-0c25c5dc1222', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '4', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=ec35d953-ee21-47b6-bef7-1618058f79be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.597 105573 INFO neutron.agent.ovn.metadata.agent [-] Port ec35d953-ee21-47b6-bef7-1618058f79be in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.598 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6199784-1742-41a7-9152-bb54abb7bef1
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.608 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d36141-734e-4765-9dfa-d986afeb9268]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 16 13:24:29 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 21.679s CPU time.
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.629 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[9367c26b-4a0c-4b0f-ad49-5ff305a28441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 systemd-machined[155419]: Machine qemu-1-instance-00000002 terminated.
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.632 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3329f7-f9ff-471d-96ac-9a020ad77064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.650 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[42a85994-b0c1-4e36-99fb-070fc45b83f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.664 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[abd73ee3-6e13-4bcb-904f-188755ee7997]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6199784-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b9:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 32, 'tx_packets': 15, 'rx_bytes': 1624, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415002, 'reachable_time': 42363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207878, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.675 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e84cea23-6e19-45c9-b384-2e3ac8a2da6d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415011, 'tstamp': 415011}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207879, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa6199784-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415012, 'tstamp': 415012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207879, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.677 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.678 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.681 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.682 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6199784-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.682 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.682 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6199784-10, col_values=(('external_ids', {'iface-id': '3b5a298b-9fc2-4705-8faa-2b8cfb88937b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:29.683 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.800 185914 INFO nova.virt.libvirt.driver [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Instance destroyed successfully.
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.800 185914 DEBUG nova.objects.instance [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid 5021a07d-59d2-49c7-b92f-0c25c5dc1222 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.825 185914 DEBUG nova.virt.libvirt.vif [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-836105514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-836105514',id=2,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-90mzu2dc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owne
r_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:21:01Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=5021a07d-59d2-49c7-b92f-0c25c5dc1222,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.825 185914 DEBUG nova.network.os_vif_util [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "ec35d953-ee21-47b6-bef7-1618058f79be", "address": "fa:16:3e:be:da:c0", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec35d953-ee", "ovs_interfaceid": "ec35d953-ee21-47b6-bef7-1618058f79be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.826 185914 DEBUG nova.network.os_vif_util [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.826 185914 DEBUG os_vif [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.828 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.828 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec35d953-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.829 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.831 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.833 185914 INFO os_vif [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:da:c0,bridge_name='br-int',has_traffic_filtering=True,id=ec35d953-ee21-47b6-bef7-1618058f79be,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec35d953-ee')
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.834 185914 INFO nova.virt.libvirt.driver [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Deleting instance files /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222_del
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.834 185914 INFO nova.virt.libvirt.driver [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Deletion of /var/lib/nova/instances/5021a07d-59d2-49c7-b92f-0c25c5dc1222_del complete
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.903 185914 INFO nova.compute.manager [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.903 185914 DEBUG oslo.service.loopingcall [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.904 185914 DEBUG nova.compute.manager [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:29 compute-1 nova_compute[185910]: 2026-02-16 13:24:29.904 185914 DEBUG nova.network.neutron [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:30 compute-1 podman[207896]: 2026-02-16 13:24:30.932768988 +0000 UTC m=+0.071279719 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.937 185914 DEBUG nova.compute.manager [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-unplugged-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.937 185914 DEBUG oslo_concurrency.lockutils [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.937 185914 DEBUG oslo_concurrency.lockutils [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.937 185914 DEBUG oslo_concurrency.lockutils [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.938 185914 DEBUG nova.compute.manager [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] No waiting events found dispatching network-vif-unplugged-ec35d953-ee21-47b6-bef7-1618058f79be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.938 185914 DEBUG nova.compute.manager [req-54d8c960-6a92-4c6d-ae6c-ad902006b4bb req-2d5fa619-c468-4b63-9da1-fb2669b4e995 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-unplugged-ec35d953-ee21-47b6-bef7-1618058f79be for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.968 185914 DEBUG nova.network.neutron [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:30 compute-1 nova_compute[185910]: 2026-02-16 13:24:30.992 185914 INFO nova.compute.manager [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Took 1.09 seconds to deallocate network for instance.
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.032 185914 DEBUG nova.compute.manager [req-aad2d7b1-21a2-4a5d-9915-9ec3e7e76a16 req-8be017de-5fe3-4de2-a8a8-888a3e0116fc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-deleted-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.066 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.067 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.153 185914 DEBUG nova.compute.provider_tree [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.176 185914 DEBUG nova.scheduler.client.report [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.208 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.275 185914 INFO nova.scheduler.client.report [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance 5021a07d-59d2-49c7-b92f-0c25c5dc1222
Feb 16 13:24:31 compute-1 nova_compute[185910]: 2026-02-16 13:24:31.366 185914 DEBUG oslo_concurrency.lockutils [None req-1a0d85fe-bd6c-4ed1-a274-2dd883321e71 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.346 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.347 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.347 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.347 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.348 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.349 185914 INFO nova.compute.manager [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Terminating instance
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.350 185914 DEBUG nova.compute.manager [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:24:32 compute-1 kernel: tapb0642d70-aa (unregistering): left promiscuous mode
Feb 16 13:24:32 compute-1 NetworkManager[56388]: <info>  [1771248272.3809] device (tapb0642d70-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.386 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 ovn_controller[96285]: 2026-02-16T13:24:32Z|00056|binding|INFO|Releasing lport b0642d70-aac9-4a19-b18b-6f6a914d307a from this chassis (sb_readonly=0)
Feb 16 13:24:32 compute-1 ovn_controller[96285]: 2026-02-16T13:24:32Z|00057|binding|INFO|Setting lport b0642d70-aac9-4a19-b18b-6f6a914d307a down in Southbound
Feb 16 13:24:32 compute-1 ovn_controller[96285]: 2026-02-16T13:24:32Z|00058|binding|INFO|Removing iface tapb0642d70-aa ovn-installed in OVS
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.388 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.395 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.404 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:7c:d9 10.100.0.6'], port_security=['fa:16:3e:b1:7c:d9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '934dfad2-33a3-44dd-82c8-0b913e89cb8e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6199784-1742-41a7-9152-bb54abb7bef1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e0321e3a614b62a46eef7fb2e737ff', 'neutron:revision_number': '8', 'neutron:security_group_ids': '22e3f3ae-6435-49f2-b1a3-ead6d5ff75b1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a8796a1-c459-4e68-a95d-23fef829aa8d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b0642d70-aac9-4a19-b18b-6f6a914d307a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.405 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b0642d70-aac9-4a19-b18b-6f6a914d307a in datapath a6199784-1742-41a7-9152-bb54abb7bef1 unbound from our chassis
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.406 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6199784-1742-41a7-9152-bb54abb7bef1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.407 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f1505a8d-18e2-468d-8f82-c5d8a55699dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.408 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 namespace which is not needed anymore
Feb 16 13:24:32 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 16 13:24:32 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 20.353s CPU time.
Feb 16 13:24:32 compute-1 systemd-machined[155419]: Machine qemu-2-instance-00000001 terminated.
Feb 16 13:24:32 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [NOTICE]   (206797) : haproxy version is 2.8.14-c23fe91
Feb 16 13:24:32 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [NOTICE]   (206797) : path to executable is /usr/sbin/haproxy
Feb 16 13:24:32 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [WARNING]  (206797) : Exiting Master process...
Feb 16 13:24:32 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [ALERT]    (206797) : Current worker (206799) exited with code 143 (Terminated)
Feb 16 13:24:32 compute-1 neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1[206793]: [WARNING]  (206797) : All workers exited. Exiting... (0)
Feb 16 13:24:32 compute-1 systemd[1]: libpod-9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b.scope: Deactivated successfully.
Feb 16 13:24:32 compute-1 podman[207949]: 2026-02-16 13:24:32.546023485 +0000 UTC m=+0.054320938 container died 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:24:32 compute-1 NetworkManager[56388]: <info>  [1771248272.5661] manager: (tapb0642d70-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Feb 16 13:24:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b-userdata-shm.mount: Deactivated successfully.
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.569 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-23d7042eadfbaad7b78f21ea23482de4d076e2b67c8702c3efa23c00b613fb06-merged.mount: Deactivated successfully.
Feb 16 13:24:32 compute-1 podman[207949]: 2026-02-16 13:24:32.58000887 +0000 UTC m=+0.088306323 container cleanup 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:24:32 compute-1 systemd[1]: libpod-conmon-9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b.scope: Deactivated successfully.
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.598 185914 INFO nova.virt.libvirt.driver [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Instance destroyed successfully.
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.600 185914 DEBUG nova.objects.instance [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lazy-loading 'resources' on Instance uuid 934dfad2-33a3-44dd-82c8-0b913e89cb8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.624 185914 DEBUG nova.virt.libvirt.vif [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:20:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-727824786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-727824786',id=1,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:21:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5e0321e3a614b62a46eef7fb2e737ff',ramdisk_id='',reservation_id='r-7k7vpckb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owne
r_project_name='tempest-TestExecuteActionsViaActuator-1504038973',owner_user_name='tempest-TestExecuteActionsViaActuator-1504038973-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:21:32Z,user_data=None,user_id='53b5045c5aaf4a7d8d84dce2ac4aa424',uuid=934dfad2-33a3-44dd-82c8-0b913e89cb8e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.624 185914 DEBUG nova.network.os_vif_util [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converting VIF {"id": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "address": "fa:16:3e:b1:7c:d9", "network": {"id": "a6199784-1742-41a7-9152-bb54abb7bef1", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-926379118-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5e0321e3a614b62a46eef7fb2e737ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0642d70-aa", "ovs_interfaceid": "b0642d70-aac9-4a19-b18b-6f6a914d307a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.625 185914 DEBUG nova.network.os_vif_util [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.625 185914 DEBUG os_vif [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.627 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.627 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0642d70-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.629 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.631 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.634 185914 INFO os_vif [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:7c:d9,bridge_name='br-int',has_traffic_filtering=True,id=b0642d70-aac9-4a19-b18b-6f6a914d307a,network=Network(a6199784-1742-41a7-9152-bb54abb7bef1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0642d70-aa')
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.635 185914 INFO nova.virt.libvirt.driver [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Deleting instance files /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_del
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.638 185914 INFO nova.virt.libvirt.driver [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Deletion of /var/lib/nova/instances/934dfad2-33a3-44dd-82c8-0b913e89cb8e_del complete
Feb 16 13:24:32 compute-1 podman[207991]: 2026-02-16 13:24:32.645205826 +0000 UTC m=+0.045283607 container remove 9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.649 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5f44de-aa18-49a4-aca1-4224e00ee77b]: (4, ('Mon Feb 16 01:24:32 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b)\n9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b\nMon Feb 16 01:24:32 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 (9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b)\n9d5cba430ebdb7506868c1e638fc90ec14ffa2cebaac230c8259478db052724b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.650 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[63545425-f85a-44b0-b753-a165ad4736f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.651 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6199784-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.652 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 kernel: tapa6199784-10: left promiscuous mode
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.657 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.660 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[02794774-882a-4194-9e7b-7922ada708f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.676 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[98b3f79c-d3fc-462a-b6b2-7a26d7f8cd43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.678 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b7cb15a3-ff4f-415d-9c50-f3672ba582ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.690 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e301a4c2-4ff5-41e9-81fc-bc3598319ee2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414992, 'reachable_time': 19153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208010, 'error': None, 'target': 'ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.699 185914 INFO nova.compute.manager [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.699 185914 DEBUG oslo.service.loopingcall [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.699 185914 DEBUG nova.compute.manager [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.699 185914 DEBUG nova.network.neutron [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:24:32 compute-1 systemd[1]: run-netns-ovnmeta\x2da6199784\x2d1742\x2d41a7\x2d9152\x2dbb54abb7bef1.mount: Deactivated successfully.
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.699 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6199784-1742-41a7-9152-bb54abb7bef1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:24:32 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:32.699 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[22024036-8848-42d0-b336-d4b308dc7f03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:24:32 compute-1 nova_compute[185910]: 2026-02-16 13:24:32.927 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.116 185914 DEBUG nova.compute.manager [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.116 185914 DEBUG oslo_concurrency.lockutils [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.116 185914 DEBUG oslo_concurrency.lockutils [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.117 185914 DEBUG oslo_concurrency.lockutils [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5021a07d-59d2-49c7-b92f-0c25c5dc1222-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.117 185914 DEBUG nova.compute.manager [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] No waiting events found dispatching network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:33 compute-1 nova_compute[185910]: 2026-02-16 13:24:33.117 185914 WARNING nova.compute.manager [req-259f2dff-76bc-4ce2-9c3c-1a08cf042eac req-e4ae2d6c-adbb-442d-a409-bc46c805aee9 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Received unexpected event network-vif-plugged-ec35d953-ee21-47b6-bef7-1618058f79be for instance with vm_state deleted and task_state None.
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.307 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248260.3061762, 070628d7-dd99-487b-be76-d66c0d82ebc3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.307 185914 INFO nova.compute.manager [-] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] VM Stopped (Lifecycle Event)
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.348 185914 DEBUG nova.compute.manager [None req-97d06fd3-8e26-44f0-afe8-c8d2ba5b3177 - - - - - -] [instance: 070628d7-dd99-487b-be76-d66c0d82ebc3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.477 185914 DEBUG nova.compute.manager [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.477 185914 DEBUG oslo_concurrency.lockutils [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.477 185914 DEBUG oslo_concurrency.lockutils [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.477 185914 DEBUG oslo_concurrency.lockutils [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.478 185914 DEBUG nova.compute.manager [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.478 185914 DEBUG nova.compute.manager [req-a98caffd-2a4d-4822-a331-8136373dd6ab req-eba131e9-593b-4110-8f1b-e8f6615da433 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-unplugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:24:35 compute-1 podman[195236]: time="2026-02-16T13:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:24:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:24:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.832 185914 DEBUG nova.network.neutron [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.879 185914 INFO nova.compute.manager [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Took 3.18 seconds to deallocate network for instance.
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.942 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:35 compute-1 nova_compute[185910]: 2026-02-16 13:24:35.943 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:36 compute-1 nova_compute[185910]: 2026-02-16 13:24:36.001 185914 DEBUG nova.compute.provider_tree [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:24:36 compute-1 nova_compute[185910]: 2026-02-16 13:24:36.021 185914 DEBUG nova.scheduler.client.report [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:24:36 compute-1 nova_compute[185910]: 2026-02-16 13:24:36.054 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:36 compute-1 nova_compute[185910]: 2026-02-16 13:24:36.091 185914 INFO nova.scheduler.client.report [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Deleted allocations for instance 934dfad2-33a3-44dd-82c8-0b913e89cb8e
Feb 16 13:24:36 compute-1 nova_compute[185910]: 2026-02-16 13:24:36.182 185914 DEBUG oslo_concurrency.lockutils [None req-c1e9c919-23ea-49ac-a195-c407e9e99c72 53b5045c5aaf4a7d8d84dce2ac4aa424 b5e0321e3a614b62a46eef7fb2e737ff - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.631 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.649 185914 DEBUG nova.compute.manager [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-deleted-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.650 185914 DEBUG nova.compute.manager [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.650 185914 DEBUG oslo_concurrency.lockutils [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.650 185914 DEBUG oslo_concurrency.lockutils [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.651 185914 DEBUG oslo_concurrency.lockutils [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "934dfad2-33a3-44dd-82c8-0b913e89cb8e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.651 185914 DEBUG nova.compute.manager [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] No waiting events found dispatching network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.651 185914 WARNING nova.compute.manager [req-401245af-fe12-4fb2-b745-6e2117288351 req-7c8a0586-bcd8-441a-89bd-b7034572d64b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Received unexpected event network-vif-plugged-b0642d70-aac9-4a19-b18b-6f6a914d307a for instance with vm_state deleted and task_state None.
Feb 16 13:24:37 compute-1 nova_compute[185910]: 2026-02-16 13:24:37.930 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:38 compute-1 podman[208013]: 2026-02-16 13:24:38.905185538 +0000 UTC m=+0.045400860 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:24:41 compute-1 nova_compute[185910]: 2026-02-16 13:24:41.147 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248266.1458733, b21f8b55-68d7-4cd7-beed-2d61f932f84e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:41 compute-1 nova_compute[185910]: 2026-02-16 13:24:41.147 185914 INFO nova.compute.manager [-] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] VM Stopped (Lifecycle Event)
Feb 16 13:24:41 compute-1 nova_compute[185910]: 2026-02-16 13:24:41.171 185914 DEBUG nova.compute.manager [None req-8509b7d7-54b7-48b4-be30-a1c10761c941 - - - - - -] [instance: b21f8b55-68d7-4cd7-beed-2d61f932f84e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:42 compute-1 nova_compute[185910]: 2026-02-16 13:24:42.634 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:42 compute-1 nova_compute[185910]: 2026-02-16 13:24:42.932 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:44 compute-1 nova_compute[185910]: 2026-02-16 13:24:44.798 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248269.7976522, 5021a07d-59d2-49c7-b92f-0c25c5dc1222 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:44 compute-1 nova_compute[185910]: 2026-02-16 13:24:44.799 185914 INFO nova.compute.manager [-] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] VM Stopped (Lifecycle Event)
Feb 16 13:24:44 compute-1 nova_compute[185910]: 2026-02-16 13:24:44.826 185914 DEBUG nova.compute.manager [None req-3438cf47-bd0f-4037-8e34-532fb7220105 - - - - - -] [instance: 5021a07d-59d2-49c7-b92f-0c25c5dc1222] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:47 compute-1 nova_compute[185910]: 2026-02-16 13:24:47.595 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248272.5934386, 934dfad2-33a3-44dd-82c8-0b913e89cb8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:24:47 compute-1 nova_compute[185910]: 2026-02-16 13:24:47.595 185914 INFO nova.compute.manager [-] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] VM Stopped (Lifecycle Event)
Feb 16 13:24:47 compute-1 nova_compute[185910]: 2026-02-16 13:24:47.662 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:47 compute-1 nova_compute[185910]: 2026-02-16 13:24:47.834 185914 DEBUG nova.compute.manager [None req-7bc6c50d-31ae-45e0-bec2-b8dfbbf48c1c - - - - - -] [instance: 934dfad2-33a3-44dd-82c8-0b913e89cb8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:24:47 compute-1 nova_compute[185910]: 2026-02-16 13:24:47.934 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:48 compute-1 nova_compute[185910]: 2026-02-16 13:24:48.661 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:49 compute-1 openstack_network_exporter[198096]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:24:49 compute-1 openstack_network_exporter[198096]: ERROR   13:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:24:52 compute-1 nova_compute[185910]: 2026-02-16 13:24:52.665 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:52 compute-1 nova_compute[185910]: 2026-02-16 13:24:52.937 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:57 compute-1 nova_compute[185910]: 2026-02-16 13:24:57.703 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:57 compute-1 podman[208038]: 2026-02-16 13:24:57.924319328 +0000 UTC m=+0.068809404 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 16 13:24:57 compute-1 podman[208037]: 2026-02-16 13:24:57.924328418 +0000 UTC m=+0.069939344 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Feb 16 13:24:57 compute-1 nova_compute[185910]: 2026-02-16 13:24:57.938 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:59 compute-1 nova_compute[185910]: 2026-02-16 13:24:59.408 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:24:59 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:59.408 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:24:59 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:24:59.409 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:24:59 compute-1 nova_compute[185910]: 2026-02-16 13:24:59.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:01 compute-1 podman[208076]: 2026-02-16 13:25:01.921574877 +0000 UTC m=+0.063459721 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:25:02 compute-1 nova_compute[185910]: 2026-02-16 13:25:02.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:02 compute-1 nova_compute[185910]: 2026-02-16 13:25:02.705 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:02 compute-1 nova_compute[185910]: 2026-02-16 13:25:02.940 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:03.331 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:03.332 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:03.332 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:03 compute-1 nova_compute[185910]: 2026-02-16 13:25:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:03 compute-1 nova_compute[185910]: 2026-02-16 13:25:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:04 compute-1 sshd-session[208104]: Invalid user user from 188.166.42.159 port 47022
Feb 16 13:25:04 compute-1 sshd-session[208104]: Connection closed by invalid user user 188.166.42.159 port 47022 [preauth]
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:05 compute-1 podman[195236]: time="2026-02-16T13:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:25:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:25:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.694 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.695 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.695 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.695 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.821 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.822 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5881MB free_disk=73.22798919677734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.986 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:25:05 compute-1 nova_compute[185910]: 2026-02-16 13:25:05.986 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:25:06 compute-1 nova_compute[185910]: 2026-02-16 13:25:06.034 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:25:06 compute-1 nova_compute[185910]: 2026-02-16 13:25:06.056 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:25:06 compute-1 nova_compute[185910]: 2026-02-16 13:25:06.084 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:25:06 compute-1 nova_compute[185910]: 2026-02-16 13:25:06.085 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:07 compute-1 nova_compute[185910]: 2026-02-16 13:25:07.080 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:07 compute-1 nova_compute[185910]: 2026-02-16 13:25:07.707 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:07 compute-1 nova_compute[185910]: 2026-02-16 13:25:07.942 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:08 compute-1 nova_compute[185910]: 2026-02-16 13:25:08.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:08 compute-1 nova_compute[185910]: 2026-02-16 13:25:08.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:25:08 compute-1 nova_compute[185910]: 2026-02-16 13:25:08.743 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:25:08 compute-1 nova_compute[185910]: 2026-02-16 13:25:08.744 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:08 compute-1 nova_compute[185910]: 2026-02-16 13:25:08.744 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:25:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:09.411 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:09 compute-1 podman[208107]: 2026-02-16 13:25:09.93167353 +0000 UTC m=+0.071700970 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:25:10 compute-1 nova_compute[185910]: 2026-02-16 13:25:10.739 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:25:10 compute-1 sshd-session[208132]: Connection closed by authenticating user root 146.190.226.24 port 52580 [preauth]
Feb 16 13:25:12 compute-1 nova_compute[185910]: 2026-02-16 13:25:12.710 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:12 compute-1 nova_compute[185910]: 2026-02-16 13:25:12.945 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:17 compute-1 nova_compute[185910]: 2026-02-16 13:25:17.713 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:17 compute-1 nova_compute[185910]: 2026-02-16 13:25:17.946 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:19 compute-1 openstack_network_exporter[198096]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:25:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:25:19 compute-1 openstack_network_exporter[198096]: ERROR   13:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:25:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:25:22 compute-1 nova_compute[185910]: 2026-02-16 13:25:22.715 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:22 compute-1 nova_compute[185910]: 2026-02-16 13:25:22.948 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:27 compute-1 nova_compute[185910]: 2026-02-16 13:25:27.718 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:27 compute-1 nova_compute[185910]: 2026-02-16 13:25:27.952 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:28 compute-1 podman[208134]: 2026-02-16 13:25:28.916466064 +0000 UTC m=+0.049895527 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1770267347, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 16 13:25:28 compute-1 podman[208135]: 2026-02-16 13:25:28.916468124 +0000 UTC m=+0.047092001 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:25:30 compute-1 ovn_controller[96285]: 2026-02-16T13:25:30Z|00059|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:25:32 compute-1 nova_compute[185910]: 2026-02-16 13:25:32.720 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:32 compute-1 podman[208173]: 2026-02-16 13:25:32.927924173 +0000 UTC m=+0.069761516 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:25:32 compute-1 nova_compute[185910]: 2026-02-16 13:25:32.993 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:35 compute-1 podman[195236]: time="2026-02-16T13:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:25:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:25:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Feb 16 13:25:37 compute-1 nova_compute[185910]: 2026-02-16 13:25:37.775 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:37 compute-1 nova_compute[185910]: 2026-02-16 13:25:37.995 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:40 compute-1 podman[208200]: 2026-02-16 13:25:40.902776466 +0000 UTC m=+0.046260239 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:25:42 compute-1 nova_compute[185910]: 2026-02-16 13:25:42.777 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:42 compute-1 nova_compute[185910]: 2026-02-16 13:25:42.998 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:47 compute-1 nova_compute[185910]: 2026-02-16 13:25:47.815 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:48 compute-1 nova_compute[185910]: 2026-02-16 13:25:48.000 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:48 compute-1 nova_compute[185910]: 2026-02-16 13:25:48.947 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:48 compute-1 nova_compute[185910]: 2026-02-16 13:25:48.948 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.014 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.176 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.177 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.193 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.194 185914 INFO nova.compute.claims [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:25:49 compute-1 openstack_network_exporter[198096]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:25:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:25:49 compute-1 openstack_network_exporter[198096]: ERROR   13:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:25:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.468 185914 DEBUG nova.compute.provider_tree [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.488 185914 DEBUG nova.scheduler.client.report [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.549 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.550 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.619 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.619 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.656 185914 INFO nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.688 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.840 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.842 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.842 185914 INFO nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Creating image(s)
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.843 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.843 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.844 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.860 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.904 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.905 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.906 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.921 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.968 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.970 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.996 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.997 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:49 compute-1 nova_compute[185910]: 2026-02-16 13:25:49.997 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.043 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.044 185914 DEBUG nova.virt.disk.api [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Checking if we can resize image /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.045 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.090 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.092 185914 DEBUG nova.virt.disk.api [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Cannot resize image /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.092 185914 DEBUG nova.objects.instance [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'migration_context' on Instance uuid 56c9e87d-b9eb-4307-80a2-8f5bf631c74d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.112 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.113 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Ensure instance console log exists: /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.113 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.113 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.114 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:50 compute-1 nova_compute[185910]: 2026-02-16 13:25:50.500 185914 DEBUG nova.policy [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '566db36bffff4193a494fef52f968126', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67efa696c46c451ba23d1157e0816503', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:25:51 compute-1 nova_compute[185910]: 2026-02-16 13:25:51.724 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Successfully created port: 47879738-fc6d-440e-ab06-95bba1de09df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:25:52 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:25:52 compute-1 nova_compute[185910]: 2026-02-16 13:25:52.817 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.001 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.589 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Successfully updated port: 47879738-fc6d-440e-ab06-95bba1de09df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.610 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.610 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquired lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.611 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.804 185914 DEBUG nova.compute.manager [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-changed-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.805 185914 DEBUG nova.compute.manager [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Refreshing instance network info cache due to event network-changed-47879738-fc6d-440e-ab06-95bba1de09df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:25:53 compute-1 nova_compute[185910]: 2026-02-16 13:25:53.805 185914 DEBUG oslo_concurrency.lockutils [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:25:54 compute-1 nova_compute[185910]: 2026-02-16 13:25:54.434 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.806 185914 DEBUG nova.network.neutron [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updating instance_info_cache with network_info: [{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.836 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Releasing lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.836 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Instance network_info: |[{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.837 185914 DEBUG oslo_concurrency.lockutils [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.837 185914 DEBUG nova.network.neutron [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Refreshing network info cache for port 47879738-fc6d-440e-ab06-95bba1de09df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.841 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Start _get_guest_xml network_info=[{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.846 185914 WARNING nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.855 185914 DEBUG nova.virt.libvirt.host [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.856 185914 DEBUG nova.virt.libvirt.host [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.859 185914 DEBUG nova.virt.libvirt.host [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.859 185914 DEBUG nova.virt.libvirt.host [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.860 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.860 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.861 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.861 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.861 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.861 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.862 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.862 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.862 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.862 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.862 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.863 185914 DEBUG nova.virt.hardware [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.866 185914 DEBUG nova.virt.libvirt.vif [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:25:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-645930396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-645930396',id=7,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-tyu7y93f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:25:49Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=56c9e87d-b9eb-4307-80a2-8f5bf631c74d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.866 185914 DEBUG nova.network.os_vif_util [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.867 185914 DEBUG nova.network.os_vif_util [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.867 185914 DEBUG nova.objects.instance [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'pci_devices' on Instance uuid 56c9e87d-b9eb-4307-80a2-8f5bf631c74d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.886 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <uuid>56c9e87d-b9eb-4307-80a2-8f5bf631c74d</uuid>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <name>instance-00000007</name>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteBasicStrategy-server-645930396</nova:name>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:25:56</nova:creationTime>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:user uuid="566db36bffff4193a494fef52f968126">tempest-TestExecuteBasicStrategy-2074109192-project-member</nova:user>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:project uuid="67efa696c46c451ba23d1157e0816503">tempest-TestExecuteBasicStrategy-2074109192</nova:project>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         <nova:port uuid="47879738-fc6d-440e-ab06-95bba1de09df">
Feb 16 13:25:56 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <system>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="serial">56c9e87d-b9eb-4307-80a2-8f5bf631c74d</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="uuid">56c9e87d-b9eb-4307-80a2-8f5bf631c74d</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </system>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <os>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </os>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <features>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </features>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.config"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:92:e3:68"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <target dev="tap47879738-fc"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/console.log" append="off"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <video>
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </video>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:25:56 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:25:56 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:25:56 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:25:56 compute-1 nova_compute[185910]: </domain>
Feb 16 13:25:56 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.887 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Preparing to wait for external event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.887 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.888 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.888 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.888 185914 DEBUG nova.virt.libvirt.vif [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:25:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-645930396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-645930396',id=7,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-tyu7y93f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:25:49Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=56c9e87d-b9eb-4307-80a2-8f5bf631c74d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.889 185914 DEBUG nova.network.os_vif_util [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.889 185914 DEBUG nova.network.os_vif_util [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.890 185914 DEBUG os_vif [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.890 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.890 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.891 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.893 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.893 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47879738-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.894 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47879738-fc, col_values=(('external_ids', {'iface-id': '47879738-fc6d-440e-ab06-95bba1de09df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:e3:68', 'vm-uuid': '56c9e87d-b9eb-4307-80a2-8f5bf631c74d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.895 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-1 NetworkManager[56388]: <info>  [1771248356.8967] manager: (tap47879738-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.899 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.900 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.901 185914 INFO os_vif [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc')
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.985 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.986 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.986 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] No VIF found with MAC fa:16:3e:92:e3:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:25:56 compute-1 nova_compute[185910]: 2026-02-16 13:25:56.986 185914 INFO nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Using config drive
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.766 185914 INFO nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Creating config drive at /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.config
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.770 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjgj5gdu0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.888 185914 DEBUG oslo_concurrency.processutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjgj5gdu0" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:25:57 compute-1 kernel: tap47879738-fc: entered promiscuous mode
Feb 16 13:25:57 compute-1 NetworkManager[56388]: <info>  [1771248357.9434] manager: (tap47879738-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 16 13:25:57 compute-1 ovn_controller[96285]: 2026-02-16T13:25:57Z|00060|binding|INFO|Claiming lport 47879738-fc6d-440e-ab06-95bba1de09df for this chassis.
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.943 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:57 compute-1 ovn_controller[96285]: 2026-02-16T13:25:57Z|00061|binding|INFO|47879738-fc6d-440e-ab06-95bba1de09df: Claiming fa:16:3e:92:e3:68 10.100.0.10
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.951 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:57 compute-1 ovn_controller[96285]: 2026-02-16T13:25:57Z|00062|binding|INFO|Setting lport 47879738-fc6d-440e-ab06-95bba1de09df ovn-installed in OVS
Feb 16 13:25:57 compute-1 ovn_controller[96285]: 2026-02-16T13:25:57Z|00063|binding|INFO|Setting lport 47879738-fc6d-440e-ab06-95bba1de09df up in Southbound
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.967 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:e3:68 10.100.0.10'], port_security=['fa:16:3e:92:e3:68 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '56c9e87d-b9eb-4307-80a2-8f5bf631c74d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=47879738-fc6d-440e-ab06-95bba1de09df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:25:57 compute-1 systemd-udevd[208258]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:25:57 compute-1 nova_compute[185910]: 2026-02-16 13:25:57.969 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.969 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 47879738-fc6d-440e-ab06-95bba1de09df in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc bound to our chassis
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.971 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:25:57 compute-1 systemd-machined[155419]: New machine qemu-5-instance-00000007.
Feb 16 13:25:57 compute-1 NetworkManager[56388]: <info>  [1771248357.9834] device (tap47879738-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:25:57 compute-1 NetworkManager[56388]: <info>  [1771248357.9839] device (tap47879738-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.982 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[66ee528d-f55f-41dd-9b68-7044b2f1bb43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.983 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85f5abea-a1 in ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.985 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85f5abea-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.985 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[21f6e8b3-2bf9-4e65-b6d6-969ea3bdebfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.986 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2384dd79-3494-4c00-b763-5624f712b49c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:57.993 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[5021c795-0773-4907-b991-f23507b25acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:57 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.003 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.012 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[98c748ba-fbd8-4963-8b9e-e3810ba7ee65]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.032 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[25186241-d60f-44b6-bb6b-4b704508daf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.037 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[727cbea3-aacd-4ce6-bceb-e650e72f82a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 NetworkManager[56388]: <info>  [1771248358.0383] manager: (tap85f5abea-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.054 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[b7dfa324-04bc-446d-b783-9a18dd63ef1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.058 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[73d1ac81-4ac3-463c-91be-a47c303295d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 NetworkManager[56388]: <info>  [1771248358.0706] device (tap85f5abea-a0): carrier: link connected
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.072 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[266ee5d7-0ed9-4cec-8bd8-b6642db8fdd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.083 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c59731e0-0e85-4ca9-8c9c-6aaf68b0ab9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444249, 'reachable_time': 41195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208295, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.092 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5e4ff0-5177-4d98-ba44-43c20a2890e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:9c74'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444249, 'tstamp': 444249}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208296, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.102 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b91a2abf-dfb4-4830-88f8-cd66ee7df3af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444249, 'reachable_time': 41195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208297, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.119 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[214d1b84-f177-452d-a3fb-1f546d89f53a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.160 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf5b215-fafd-457a-9971-0bb6e7ce38c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.161 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.162 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.162 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f5abea-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:58 compute-1 NetworkManager[56388]: <info>  [1771248358.2041] manager: (tap85f5abea-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.203 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:58 compute-1 kernel: tap85f5abea-a0: entered promiscuous mode
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.207 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.208 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85f5abea-a0, col_values=(('external_ids', {'iface-id': 'd3feb028-a3ab-4f43-8a4b-1ee3054fd9f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.209 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:58 compute-1 ovn_controller[96285]: 2026-02-16T13:25:58Z|00064|binding|INFO|Releasing lport d3feb028-a3ab-4f43-8a4b-1ee3054fd9f1 from this chassis (sb_readonly=0)
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.212 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.214 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.213 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6fd61585-e02c-4588-9e21-37f60af697b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.214 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/85f5abea-ac25-4244-a69b-79e29b2ba1fc.pid.haproxy
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:25:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:25:58.214 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'env', 'PROCESS_TAG=haproxy-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85f5abea-ac25-4244-a69b-79e29b2ba1fc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.304 185914 DEBUG nova.compute.manager [req-716e2151-dfb7-4a8c-8139-d3321f88e73b req-2a384d7d-04e8-46ac-b58a-0aea363aabc2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.305 185914 DEBUG oslo_concurrency.lockutils [req-716e2151-dfb7-4a8c-8139-d3321f88e73b req-2a384d7d-04e8-46ac-b58a-0aea363aabc2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.305 185914 DEBUG oslo_concurrency.lockutils [req-716e2151-dfb7-4a8c-8139-d3321f88e73b req-2a384d7d-04e8-46ac-b58a-0aea363aabc2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.306 185914 DEBUG oslo_concurrency.lockutils [req-716e2151-dfb7-4a8c-8139-d3321f88e73b req-2a384d7d-04e8-46ac-b58a-0aea363aabc2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.306 185914 DEBUG nova.compute.manager [req-716e2151-dfb7-4a8c-8139-d3321f88e73b req-2a384d7d-04e8-46ac-b58a-0aea363aabc2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Processing event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:25:58 compute-1 podman[208329]: 2026-02-16 13:25:58.538625643 +0000 UTC m=+0.050998826 container create 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:25:58 compute-1 systemd[1]: Started libpod-conmon-6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d.scope.
Feb 16 13:25:58 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:25:58 compute-1 podman[208329]: 2026-02-16 13:25:58.509612135 +0000 UTC m=+0.021985348 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:25:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eee320d88c8ac82ea860386a3dc5f31038f9b2809b7c6402d8cdcadf0a1703e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:25:58 compute-1 podman[208329]: 2026-02-16 13:25:58.620948329 +0000 UTC m=+0.133321542 container init 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:25:58 compute-1 podman[208329]: 2026-02-16 13:25:58.62502579 +0000 UTC m=+0.137398973 container start 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:25:58 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [NOTICE]   (208355) : New worker (208357) forked
Feb 16 13:25:58 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [NOTICE]   (208355) : Loading success.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.697 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.698 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248358.6967833, 56c9e87d-b9eb-4307-80a2-8f5bf631c74d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.700 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] VM Started (Lifecycle Event)
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.703 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.707 185914 INFO nova.virt.libvirt.driver [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Instance spawned successfully.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.707 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.738 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.739 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.740 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.740 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.741 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.741 185914 DEBUG nova.virt.libvirt.driver [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.745 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.747 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.783 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.784 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248358.6984441, 56c9e87d-b9eb-4307-80a2-8f5bf631c74d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.784 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] VM Paused (Lifecycle Event)
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.820 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.823 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248358.7029698, 56c9e87d-b9eb-4307-80a2-8f5bf631c74d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.824 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] VM Resumed (Lifecycle Event)
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.834 185914 INFO nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Took 8.99 seconds to spawn the instance on the hypervisor.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.835 185914 DEBUG nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.851 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.855 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.912 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:25:58 compute-1 nova_compute[185910]: 2026-02-16 13:25:58.963 185914 INFO nova.compute.manager [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Took 9.87 seconds to build instance.
Feb 16 13:25:59 compute-1 nova_compute[185910]: 2026-02-16 13:25:59.006 185914 DEBUG oslo_concurrency.lockutils [None req-25d0b4a3-e660-4ff0-b226-8e54730f57ca 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:25:59 compute-1 nova_compute[185910]: 2026-02-16 13:25:59.477 185914 DEBUG nova.network.neutron [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updated VIF entry in instance network info cache for port 47879738-fc6d-440e-ab06-95bba1de09df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:25:59 compute-1 nova_compute[185910]: 2026-02-16 13:25:59.478 185914 DEBUG nova.network.neutron [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updating instance_info_cache with network_info: [{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:25:59 compute-1 nova_compute[185910]: 2026-02-16 13:25:59.506 185914 DEBUG oslo_concurrency.lockutils [req-3b945c8e-f713-41b9-8b52-4b1963b4457a req-6f86e3a3-6c39-4100-bb2d-b6bb51071783 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:25:59 compute-1 sshd-session[208367]: Invalid user vps from 188.166.42.159 port 51984
Feb 16 13:25:59 compute-1 podman[208370]: 2026-02-16 13:25:59.768584827 +0000 UTC m=+0.083390456 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:25:59 compute-1 podman[208369]: 2026-02-16 13:25:59.771685141 +0000 UTC m=+0.087160968 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, container_name=openstack_network_exporter)
Feb 16 13:25:59 compute-1 sshd-session[208367]: Connection closed by invalid user vps 188.166.42.159 port 51984 [preauth]
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.415 185914 DEBUG nova.compute.manager [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.416 185914 DEBUG oslo_concurrency.lockutils [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.416 185914 DEBUG oslo_concurrency.lockutils [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.417 185914 DEBUG oslo_concurrency.lockutils [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.417 185914 DEBUG nova.compute.manager [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] No waiting events found dispatching network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:26:00 compute-1 nova_compute[185910]: 2026-02-16 13:26:00.417 185914 WARNING nova.compute.manager [req-e2ae0a24-be93-4ebf-a1d1-b7053d6784b8 req-184eabb9-726f-4b5f-bf10-694c4dbf3fdd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received unexpected event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df for instance with vm_state active and task_state None.
Feb 16 13:26:01 compute-1 nova_compute[185910]: 2026-02-16 13:26:01.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:01 compute-1 nova_compute[185910]: 2026-02-16 13:26:01.896 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:02 compute-1 nova_compute[185910]: 2026-02-16 13:26:02.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:03 compute-1 nova_compute[185910]: 2026-02-16 13:26:03.006 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:03.332 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:03.333 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:03.333 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:03 compute-1 nova_compute[185910]: 2026-02-16 13:26:03.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:03 compute-1 podman[208407]: 2026-02-16 13:26:03.93962495 +0000 UTC m=+0.073055195 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:26:05 compute-1 nova_compute[185910]: 2026-02-16 13:26:05.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:05 compute-1 podman[195236]: time="2026-02-16T13:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:26:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:26:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 13:26:06 compute-1 nova_compute[185910]: 2026-02-16 13:26:06.898 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.661 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.662 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.744 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.792 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.793 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.842 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.952 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.954 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5692MB free_disk=73.22708129882812GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.955 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:26:07 compute-1 nova_compute[185910]: 2026-02-16 13:26:07.955 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.009 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.027 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 56c9e87d-b9eb-4307-80a2-8f5bf631c74d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.028 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.028 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.049 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.068 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.068 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.084 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.111 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.157 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.176 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.204 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:26:08 compute-1 nova_compute[185910]: 2026-02-16 13:26:08.205 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.207 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.208 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:10.440 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:26:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:10.441 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.634 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.635 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.808 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.809 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.809 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:26:10 compute-1 nova_compute[185910]: 2026-02-16 13:26:10.810 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 56c9e87d-b9eb-4307-80a2-8f5bf631c74d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:26:11 compute-1 nova_compute[185910]: 2026-02-16 13:26:11.901 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:11 compute-1 podman[208459]: 2026-02-16 13:26:11.916663859 +0000 UTC m=+0.054168362 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:26:12 compute-1 ovn_controller[96285]: 2026-02-16T13:26:12Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:e3:68 10.100.0.10
Feb 16 13:26:12 compute-1 ovn_controller[96285]: 2026-02-16T13:26:12Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:e3:68 10.100.0.10
Feb 16 13:26:13 compute-1 nova_compute[185910]: 2026-02-16 13:26:13.009 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:13 compute-1 nova_compute[185910]: 2026-02-16 13:26:13.485 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updating instance_info_cache with network_info: [{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:26:13 compute-1 nova_compute[185910]: 2026-02-16 13:26:13.516 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:26:13 compute-1 nova_compute[185910]: 2026-02-16 13:26:13.517 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:26:14 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:26:14.443 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:26:15 compute-1 sshd-session[208483]: Connection closed by authenticating user root 146.190.226.24 port 37760 [preauth]
Feb 16 13:26:16 compute-1 nova_compute[185910]: 2026-02-16 13:26:16.903 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:18 compute-1 nova_compute[185910]: 2026-02-16 13:26:18.012 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:19 compute-1 openstack_network_exporter[198096]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:26:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:26:19 compute-1 openstack_network_exporter[198096]: ERROR   13:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:26:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:26:21 compute-1 nova_compute[185910]: 2026-02-16 13:26:21.905 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:23 compute-1 nova_compute[185910]: 2026-02-16 13:26:23.013 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:26 compute-1 nova_compute[185910]: 2026-02-16 13:26:26.907 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:28 compute-1 nova_compute[185910]: 2026-02-16 13:26:28.015 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:29 compute-1 podman[208485]: 2026-02-16 13:26:29.912898183 +0000 UTC m=+0.052500297 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, release=1770267347, distribution-scope=public)
Feb 16 13:26:29 compute-1 podman[208486]: 2026-02-16 13:26:29.921449085 +0000 UTC m=+0.049259629 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 16 13:26:31 compute-1 nova_compute[185910]: 2026-02-16 13:26:31.909 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:33 compute-1 nova_compute[185910]: 2026-02-16 13:26:33.017 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:34 compute-1 podman[208527]: 2026-02-16 13:26:34.934450804 +0000 UTC m=+0.067974587 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 16 13:26:35 compute-1 podman[195236]: time="2026-02-16T13:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:26:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:26:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Feb 16 13:26:36 compute-1 nova_compute[185910]: 2026-02-16 13:26:36.911 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:38 compute-1 nova_compute[185910]: 2026-02-16 13:26:38.021 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:39 compute-1 sshd-session[208553]: Invalid user sol from 2.57.122.210 port 47182
Feb 16 13:26:39 compute-1 sshd-session[208553]: Connection closed by invalid user sol 2.57.122.210 port 47182 [preauth]
Feb 16 13:26:40 compute-1 ovn_controller[96285]: 2026-02-16T13:26:40Z|00065|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 16 13:26:41 compute-1 nova_compute[185910]: 2026-02-16 13:26:41.913 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:42 compute-1 podman[208555]: 2026-02-16 13:26:42.927723157 +0000 UTC m=+0.051011807 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:26:43 compute-1 nova_compute[185910]: 2026-02-16 13:26:43.022 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:46 compute-1 nova_compute[185910]: 2026-02-16 13:26:46.915 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:48 compute-1 nova_compute[185910]: 2026-02-16 13:26:48.024 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:49 compute-1 openstack_network_exporter[198096]: ERROR   13:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:26:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:26:49 compute-1 openstack_network_exporter[198096]: ERROR   13:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:26:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:26:51 compute-1 nova_compute[185910]: 2026-02-16 13:26:51.917 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:53 compute-1 nova_compute[185910]: 2026-02-16 13:26:53.025 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:54 compute-1 sshd-session[208579]: Invalid user testuser from 188.166.42.159 port 38548
Feb 16 13:26:54 compute-1 sshd-session[208579]: Connection closed by invalid user testuser 188.166.42.159 port 38548 [preauth]
Feb 16 13:26:56 compute-1 nova_compute[185910]: 2026-02-16 13:26:56.919 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:26:58 compute-1 nova_compute[185910]: 2026-02-16 13:26:58.027 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:00 compute-1 podman[208585]: 2026-02-16 13:27:00.911518984 +0000 UTC m=+0.049153426 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:27:00 compute-1 podman[208584]: 2026-02-16 13:27:00.940018618 +0000 UTC m=+0.083459678 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, maintainer=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z)
Feb 16 13:27:01 compute-1 nova_compute[185910]: 2026-02-16 13:27:01.920 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:02 compute-1 nova_compute[185910]: 2026-02-16 13:27:02.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:03 compute-1 nova_compute[185910]: 2026-02-16 13:27:03.051 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:03.333 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:03.333 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:03.334 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:03 compute-1 nova_compute[185910]: 2026-02-16 13:27:03.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:04 compute-1 nova_compute[185910]: 2026-02-16 13:27:04.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:05 compute-1 podman[195236]: time="2026-02-16T13:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:27:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:27:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 13:27:05 compute-1 podman[208625]: 2026-02-16 13:27:05.956131693 +0000 UTC m=+0.094021004 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 13:27:06 compute-1 nova_compute[185910]: 2026-02-16 13:27:06.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:06 compute-1 nova_compute[185910]: 2026-02-16 13:27:06.924 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.094 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.701 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.702 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.702 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.703 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.809 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.859 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.861 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:08 compute-1 nova_compute[185910]: 2026-02-16 13:27:08.931 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.082 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.084 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5674MB free_disk=73.19895935058594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.084 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.084 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.219 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 56c9e87d-b9eb-4307-80a2-8f5bf631c74d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.220 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.220 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.287 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.355 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.357 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:27:09 compute-1 nova_compute[185910]: 2026-02-16 13:27:09.358 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:10 compute-1 nova_compute[185910]: 2026-02-16 13:27:10.353 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:10 compute-1 nova_compute[185910]: 2026-02-16 13:27:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:10 compute-1 nova_compute[185910]: 2026-02-16 13:27:10.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:27:10 compute-1 nova_compute[185910]: 2026-02-16 13:27:10.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:27:11 compute-1 nova_compute[185910]: 2026-02-16 13:27:11.521 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:27:11 compute-1 nova_compute[185910]: 2026-02-16 13:27:11.521 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:27:11 compute-1 nova_compute[185910]: 2026-02-16 13:27:11.522 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:27:11 compute-1 nova_compute[185910]: 2026-02-16 13:27:11.522 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 56c9e87d-b9eb-4307-80a2-8f5bf631c74d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:27:11 compute-1 nova_compute[185910]: 2026-02-16 13:27:11.929 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.097 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.715 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updating instance_info_cache with network_info: [{"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.736 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-56c9e87d-b9eb-4307-80a2-8f5bf631c74d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.736 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.737 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:13 compute-1 nova_compute[185910]: 2026-02-16 13:27:13.737 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:27:13 compute-1 podman[208658]: 2026-02-16 13:27:13.905718668 +0000 UTC m=+0.044103579 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:27:16 compute-1 nova_compute[185910]: 2026-02-16 13:27:16.731 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:27:16 compute-1 nova_compute[185910]: 2026-02-16 13:27:16.933 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:18 compute-1 nova_compute[185910]: 2026-02-16 13:27:18.138 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:19 compute-1 openstack_network_exporter[198096]: ERROR   13:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:27:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:27:19 compute-1 openstack_network_exporter[198096]: ERROR   13:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:27:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:27:21 compute-1 nova_compute[185910]: 2026-02-16 13:27:21.965 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:22 compute-1 sshd-session[208683]: Connection closed by authenticating user root 146.190.226.24 port 38532 [preauth]
Feb 16 13:27:23 compute-1 nova_compute[185910]: 2026-02-16 13:27:23.175 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:26 compute-1 nova_compute[185910]: 2026-02-16 13:27:26.968 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:28 compute-1 nova_compute[185910]: 2026-02-16 13:27:28.178 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:31 compute-1 podman[208685]: 2026-02-16 13:27:31.909808926 +0000 UTC m=+0.049186527 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible)
Feb 16 13:27:31 compute-1 podman[208686]: 2026-02-16 13:27:31.91768362 +0000 UTC m=+0.052295972 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 16 13:27:32 compute-1 nova_compute[185910]: 2026-02-16 13:27:32.001 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:33 compute-1 nova_compute[185910]: 2026-02-16 13:27:33.179 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:35 compute-1 podman[195236]: time="2026-02-16T13:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:27:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:27:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:27:36 compute-1 podman[208725]: 2026-02-16 13:27:36.93219411 +0000 UTC m=+0.076701755 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, container_name=ovn_controller)
Feb 16 13:27:37 compute-1 nova_compute[185910]: 2026-02-16 13:27:37.003 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:37 compute-1 nova_compute[185910]: 2026-02-16 13:27:37.423 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating tmpfile /var/lib/nova/instances/tmppe2llmuf to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:27:37 compute-1 nova_compute[185910]: 2026-02-16 13:27:37.424 185914 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:27:38 compute-1 nova_compute[185910]: 2026-02-16 13:27:38.219 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:40 compute-1 nova_compute[185910]: 2026-02-16 13:27:40.085 185914 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:27:40 compute-1 nova_compute[185910]: 2026-02-16 13:27:40.130 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:27:40 compute-1 nova_compute[185910]: 2026-02-16 13:27:40.131 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:27:40 compute-1 nova_compute[185910]: 2026-02-16 13:27:40.131 185914 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:27:42 compute-1 nova_compute[185910]: 2026-02-16 13:27:42.007 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:43 compute-1 nova_compute[185910]: 2026-02-16 13:27:43.221 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:44 compute-1 podman[208751]: 2026-02-16 13:27:44.913014761 +0000 UTC m=+0.051379624 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.011 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.842 185914 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.892 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.894 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.895 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating instance directory: /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.896 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Creating disk.info with the contents: {'/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk': 'qcow2', '/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.896 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.897 185914 DEBUG nova.objects.instance [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.931 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.978 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.978 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.979 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:47 compute-1 nova_compute[185910]: 2026-02-16 13:27:47.989 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.038 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.039 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.065 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk 1073741824" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.066 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.067 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.114 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.115 185914 DEBUG nova.virt.disk.api [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.115 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.170 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.171 185914 DEBUG nova.virt.disk.api [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.171 185914 DEBUG nova.objects.instance [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.210 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.225 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.232 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.233 185914 DEBUG nova.virt.libvirt.volume.remotefs [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config to /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.233 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.642 185914 DEBUG oslo_concurrency.processutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9/disk.config /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.643 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.644 185914 DEBUG nova.virt.libvirt.vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:26:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:26:15Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.644 185914 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.645 185914 DEBUG nova.network.os_vif_util [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.646 185914 DEBUG os_vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.646 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.647 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.647 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.650 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.651 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68d12bd9-0c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.651 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68d12bd9-0c, col_values=(('external_ids', {'iface-id': '68d12bd9-0c21-41b6-b775-1de285c4be2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:c3:87', 'vm-uuid': 'c6353280-0641-466d-9963-30eb530755e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.653 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-1 NetworkManager[56388]: <info>  [1771248468.6541] manager: (tap68d12bd9-0c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.655 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.658 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.660 185914 INFO os_vif [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c')
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.660 185914 DEBUG nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:27:48 compute-1 nova_compute[185910]: 2026-02-16 13:27:48.661 185914 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:27:48 compute-1 sshd-session[208795]: Invalid user server from 188.166.42.159 port 59168
Feb 16 13:27:48 compute-1 sshd-session[208795]: Connection closed by invalid user server 188.166.42.159 port 59168 [preauth]
Feb 16 13:27:49 compute-1 openstack_network_exporter[198096]: ERROR   13:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:27:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:27:49 compute-1 openstack_network_exporter[198096]: ERROR   13:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:27:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:27:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:50.243 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:27:50 compute-1 nova_compute[185910]: 2026-02-16 13:27:50.243 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:50.244 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:27:50 compute-1 nova_compute[185910]: 2026-02-16 13:27:50.960 185914 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Port 68d12bd9-0c21-41b6-b775-1de285c4be2c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:27:50 compute-1 nova_compute[185910]: 2026-02-16 13:27:50.962 185914 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppe2llmuf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c6353280-0641-466d-9963-30eb530755e9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:27:51 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:27:51 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:27:51 compute-1 kernel: tap68d12bd9-0c: entered promiscuous mode
Feb 16 13:27:51 compute-1 NetworkManager[56388]: <info>  [1771248471.2706] manager: (tap68d12bd9-0c): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Feb 16 13:27:51 compute-1 ovn_controller[96285]: 2026-02-16T13:27:51Z|00066|binding|INFO|Claiming lport 68d12bd9-0c21-41b6-b775-1de285c4be2c for this additional chassis.
Feb 16 13:27:51 compute-1 ovn_controller[96285]: 2026-02-16T13:27:51Z|00067|binding|INFO|68d12bd9-0c21-41b6-b775-1de285c4be2c: Claiming fa:16:3e:44:c3:87 10.100.0.5
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.271 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:51 compute-1 ovn_controller[96285]: 2026-02-16T13:27:51Z|00068|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c ovn-installed in OVS
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.279 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.281 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:51 compute-1 systemd-udevd[208832]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:27:51 compute-1 systemd-machined[155419]: New machine qemu-6-instance-00000008.
Feb 16 13:27:51 compute-1 NetworkManager[56388]: <info>  [1771248471.3147] device (tap68d12bd9-0c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:27:51 compute-1 NetworkManager[56388]: <info>  [1771248471.3152] device (tap68d12bd9-0c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:27:51 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.888 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248471.887359, c6353280-0641-466d-9963-30eb530755e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.891 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Started (Lifecycle Event)
Feb 16 13:27:51 compute-1 nova_compute[185910]: 2026-02-16 13:27:51.922 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:27:52 compute-1 nova_compute[185910]: 2026-02-16 13:27:52.706 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248472.7056763, c6353280-0641-466d-9963-30eb530755e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:27:52 compute-1 nova_compute[185910]: 2026-02-16 13:27:52.707 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Resumed (Lifecycle Event)
Feb 16 13:27:52 compute-1 nova_compute[185910]: 2026-02-16 13:27:52.745 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:27:52 compute-1 nova_compute[185910]: 2026-02-16 13:27:52.750 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:27:52 compute-1 nova_compute[185910]: 2026-02-16 13:27:52.798 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Feb 16 13:27:53 compute-1 nova_compute[185910]: 2026-02-16 13:27:53.227 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:53 compute-1 nova_compute[185910]: 2026-02-16 13:27:53.653 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:57 compute-1 ovn_controller[96285]: 2026-02-16T13:27:57Z|00069|binding|INFO|Claiming lport 68d12bd9-0c21-41b6-b775-1de285c4be2c for this chassis.
Feb 16 13:27:57 compute-1 ovn_controller[96285]: 2026-02-16T13:27:57Z|00070|binding|INFO|68d12bd9-0c21-41b6-b775-1de285c4be2c: Claiming fa:16:3e:44:c3:87 10.100.0.5
Feb 16 13:27:57 compute-1 ovn_controller[96285]: 2026-02-16T13:27:57Z|00071|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c up in Southbound
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.636 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:c3:87 10.100.0.5'], port_security=['fa:16:3e:44:c3:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6353280-0641-466d-9963-30eb530755e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '10', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=68d12bd9-0c21-41b6-b775-1de285c4be2c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.638 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 68d12bd9-0c21-41b6-b775-1de285c4be2c in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc bound to our chassis
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.641 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.664 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb4c01b-294b-4a96-8c64-1fb84b364bb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.697 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d5e071-e751-424d-aae5-e13454d0ae09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.701 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[512b2141-5b08-4386-9dc1-d0a80bd260be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.724 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[156fdc2b-c20f-47a2-9b86-a2093549f150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.744 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[acc7bfa9-af9b-4470-9284-a8be88168149]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 6, 'rx_bytes': 826, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444249, 'reachable_time': 41195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208866, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.762 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5531afbd-eba1-4b06-9adc-be9f338df543]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap85f5abea-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444255, 'tstamp': 444255}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208867, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap85f5abea-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444257, 'tstamp': 444257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208867, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.766 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:57 compute-1 nova_compute[185910]: 2026-02-16 13:27:57.768 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.770 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f5abea-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.770 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.770 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85f5abea-a0, col_values=(('external_ids', {'iface-id': 'd3feb028-a3ab-4f43-8a4b-1ee3054fd9f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:57 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:57.771 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:27:57 compute-1 nova_compute[185910]: 2026-02-16 13:27:57.886 185914 INFO nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Post operation of migration started
Feb 16 13:27:58 compute-1 nova_compute[185910]: 2026-02-16 13:27:58.230 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:58 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:27:58.246 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:27:58 compute-1 nova_compute[185910]: 2026-02-16 13:27:58.361 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:27:58 compute-1 nova_compute[185910]: 2026-02-16 13:27:58.362 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:27:58 compute-1 nova_compute[185910]: 2026-02-16 13:27:58.362 185914 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:27:58 compute-1 nova_compute[185910]: 2026-02-16 13:27:58.655 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.703 185914 DEBUG nova.network.neutron [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [{"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.738 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-c6353280-0641-466d-9963-30eb530755e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.771 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.772 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.772 185914 DEBUG oslo_concurrency.lockutils [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:27:59 compute-1 nova_compute[185910]: 2026-02-16 13:27:59.777 185914 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:27:59 compute-1 virtqemud[185025]: Domain id=6 name='instance-00000008' uuid=c6353280-0641-466d-9963-30eb530755e9 is tainted: custom-monitor
Feb 16 13:28:00 compute-1 nova_compute[185910]: 2026-02-16 13:28:00.786 185914 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:28:01 compute-1 nova_compute[185910]: 2026-02-16 13:28:01.792 185914 INFO nova.virt.libvirt.driver [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:28:01 compute-1 nova_compute[185910]: 2026-02-16 13:28:01.798 185914 DEBUG nova.compute.manager [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:28:01 compute-1 nova_compute[185910]: 2026-02-16 13:28:01.827 185914 DEBUG nova.objects.instance [None req-2c134511-64a1-4c90-949d-09cb96b35f93 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:28:02 compute-1 podman[208869]: 2026-02-16 13:28:02.912255501 +0000 UTC m=+0.051346043 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:28:02 compute-1 podman[208868]: 2026-02-16 13:28:02.918867327 +0000 UTC m=+0.058563986 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1770267347, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:28:03 compute-1 nova_compute[185910]: 2026-02-16 13:28:03.231 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:03.334 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:03.335 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:03.336 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:03 compute-1 nova_compute[185910]: 2026-02-16 13:28:03.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:03 compute-1 nova_compute[185910]: 2026-02-16 13:28:03.657 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:04 compute-1 nova_compute[185910]: 2026-02-16 13:28:04.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:04 compute-1 nova_compute[185910]: 2026-02-16 13:28:04.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:05 compute-1 podman[195236]: time="2026-02-16T13:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:28:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:28:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.864 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.865 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.865 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.866 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.866 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.867 185914 INFO nova.compute.manager [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Terminating instance
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.869 185914 DEBUG nova.compute.manager [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:28:05 compute-1 kernel: tap68d12bd9-0c (unregistering): left promiscuous mode
Feb 16 13:28:05 compute-1 NetworkManager[56388]: <info>  [1771248485.8932] device (tap68d12bd9-0c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:28:05 compute-1 ovn_controller[96285]: 2026-02-16T13:28:05Z|00072|binding|INFO|Releasing lport 68d12bd9-0c21-41b6-b775-1de285c4be2c from this chassis (sb_readonly=0)
Feb 16 13:28:05 compute-1 ovn_controller[96285]: 2026-02-16T13:28:05Z|00073|binding|INFO|Setting lport 68d12bd9-0c21-41b6-b775-1de285c4be2c down in Southbound
Feb 16 13:28:05 compute-1 ovn_controller[96285]: 2026-02-16T13:28:05Z|00074|binding|INFO|Removing iface tap68d12bd9-0c ovn-installed in OVS
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.899 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.902 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:05 compute-1 nova_compute[185910]: 2026-02-16 13:28:05.905 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.907 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:c3:87 10.100.0.5'], port_security=['fa:16:3e:44:c3:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6353280-0641-466d-9963-30eb530755e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '12', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=68d12bd9-0c21-41b6-b775-1de285c4be2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.909 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 68d12bd9-0c21-41b6-b775-1de285c4be2c in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc unbound from our chassis
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.911 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.923 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f90327b4-fc72-4b21-997d-479840e96a23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:05 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 16 13:28:05 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 1.557s CPU time.
Feb 16 13:28:05 compute-1 systemd-machined[155419]: Machine qemu-6-instance-00000008 terminated.
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.948 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e4ebc8-2dde-4298-abb5-8927f024e70a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.953 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[b7bcf366-4c55-47cb-808a-904496c267a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.972 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[14b71777-d1c1-4ba6-af4a-843f1647c97f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:05 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.989 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[28798ab1-6493-452b-a43b-8b36b5d0c524]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85f5abea-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9c:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444249, 'reachable_time': 41195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208920, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:05.999 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9769ff-d6e4-49ca-ba47-cd31779ad2b0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap85f5abea-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444255, 'tstamp': 444255}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208921, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap85f5abea-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444257, 'tstamp': 444257}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208921, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:06.001 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.002 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.005 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:06.005 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f5abea-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:06.006 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:06.006 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85f5abea-a0, col_values=(('external_ids', {'iface-id': 'd3feb028-a3ab-4f43-8a4b-1ee3054fd9f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:06.006 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.088 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.092 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.130 185914 INFO nova.virt.libvirt.driver [-] [instance: c6353280-0641-466d-9963-30eb530755e9] Instance destroyed successfully.
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.131 185914 DEBUG nova.objects.instance [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'resources' on Instance uuid c6353280-0641-466d-9963-30eb530755e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.148 185914 DEBUG nova.virt.libvirt.vif [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-92456537',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-92456537',id=8,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:26:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-d5o7uffx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:28:01Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=c6353280-0641-466d-9963-30eb530755e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.149 185914 DEBUG nova.network.os_vif_util [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "address": "fa:16:3e:44:c3:87", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68d12bd9-0c", "ovs_interfaceid": "68d12bd9-0c21-41b6-b775-1de285c4be2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.150 185914 DEBUG nova.network.os_vif_util [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.151 185914 DEBUG os_vif [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.153 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.154 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68d12bd9-0c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.155 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.157 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.160 185914 INFO os_vif [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:c3:87,bridge_name='br-int',has_traffic_filtering=True,id=68d12bd9-0c21-41b6-b775-1de285c4be2c,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68d12bd9-0c')
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.160 185914 INFO nova.virt.libvirt.driver [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Deleting instance files /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9_del
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.161 185914 INFO nova.virt.libvirt.driver [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Deletion of /var/lib/nova/instances/c6353280-0641-466d-9963-30eb530755e9_del complete
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.219 185914 INFO nova.compute.manager [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.220 185914 DEBUG oslo.service.loopingcall [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.220 185914 DEBUG nova.compute.manager [-] [instance: c6353280-0641-466d-9963-30eb530755e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:28:06 compute-1 nova_compute[185910]: 2026-02-16 13:28:06.220 185914 DEBUG nova.network.neutron [-] [instance: c6353280-0641-466d-9963-30eb530755e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.358 185914 DEBUG nova.compute.manager [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.359 185914 DEBUG oslo_concurrency.lockutils [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.359 185914 DEBUG oslo_concurrency.lockutils [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.359 185914 DEBUG oslo_concurrency.lockutils [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.360 185914 DEBUG nova.compute.manager [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.360 185914 DEBUG nova.compute.manager [req-94476152-b4e0-40a2-8116-ffb41d4dbb3a req-0813b6e5-e152-49e1-ad80-8f5dd4800160 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-unplugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.737 185914 DEBUG nova.network.neutron [-] [instance: c6353280-0641-466d-9963-30eb530755e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.758 185914 INFO nova.compute.manager [-] [instance: c6353280-0641-466d-9963-30eb530755e9] Took 1.54 seconds to deallocate network for instance.
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.816 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.816 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.826 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:07 compute-1 nova_compute[185910]: 2026-02-16 13:28:07.890 185914 INFO nova.scheduler.client.report [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Deleted allocations for instance c6353280-0641-466d-9963-30eb530755e9
Feb 16 13:28:08 compute-1 podman[208939]: 2026-02-16 13:28:08.007968653 +0000 UTC m=+0.141280136 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:28:08 compute-1 nova_compute[185910]: 2026-02-16 13:28:08.011 185914 DEBUG oslo_concurrency.lockutils [None req-34244f01-d7bd-43b9-9d94-918ed466e334 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:08 compute-1 nova_compute[185910]: 2026-02-16 13:28:08.234 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:08 compute-1 nova_compute[185910]: 2026-02-16 13:28:08.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.086 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.086 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.087 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.087 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.087 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.088 185914 INFO nova.compute.manager [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Terminating instance
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.089 185914 DEBUG nova.compute.manager [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:28:09 compute-1 kernel: tap47879738-fc (unregistering): left promiscuous mode
Feb 16 13:28:09 compute-1 NetworkManager[56388]: <info>  [1771248489.1290] device (tap47879738-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:28:09 compute-1 ovn_controller[96285]: 2026-02-16T13:28:09Z|00075|binding|INFO|Releasing lport 47879738-fc6d-440e-ab06-95bba1de09df from this chassis (sb_readonly=0)
Feb 16 13:28:09 compute-1 ovn_controller[96285]: 2026-02-16T13:28:09Z|00076|binding|INFO|Setting lport 47879738-fc6d-440e-ab06-95bba1de09df down in Southbound
Feb 16 13:28:09 compute-1 ovn_controller[96285]: 2026-02-16T13:28:09Z|00077|binding|INFO|Removing iface tap47879738-fc ovn-installed in OVS
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.131 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.139 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:e3:68 10.100.0.10'], port_security=['fa:16:3e:92:e3:68 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '56c9e87d-b9eb-4307-80a2-8f5bf631c74d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67efa696c46c451ba23d1157e0816503', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12b2bd84-2289-4cae-bae0-edbc0fcc8f32', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2457a351-b1fe-40a0-b007-96d97766c2c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=47879738-fc6d-440e-ab06-95bba1de09df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.139 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.140 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 47879738-fc6d-440e-ab06-95bba1de09df in datapath 85f5abea-ac25-4244-a69b-79e29b2ba1fc unbound from our chassis
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.141 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85f5abea-ac25-4244-a69b-79e29b2ba1fc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.142 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0530d7d1-b12c-4b42-9652-496c8527f230]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.143 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc namespace which is not needed anymore
Feb 16 13:28:09 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 16 13:28:09 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 17.091s CPU time.
Feb 16 13:28:09 compute-1 systemd-machined[155419]: Machine qemu-5-instance-00000007 terminated.
Feb 16 13:28:09 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [NOTICE]   (208355) : haproxy version is 2.8.14-c23fe91
Feb 16 13:28:09 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [NOTICE]   (208355) : path to executable is /usr/sbin/haproxy
Feb 16 13:28:09 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [WARNING]  (208355) : Exiting Master process...
Feb 16 13:28:09 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [ALERT]    (208355) : Current worker (208357) exited with code 143 (Terminated)
Feb 16 13:28:09 compute-1 neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc[208345]: [WARNING]  (208355) : All workers exited. Exiting... (0)
Feb 16 13:28:09 compute-1 systemd[1]: libpod-6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d.scope: Deactivated successfully.
Feb 16 13:28:09 compute-1 podman[208991]: 2026-02-16 13:28:09.25331239 +0000 UTC m=+0.041904431 container died 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:28:09 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d-userdata-shm.mount: Deactivated successfully.
Feb 16 13:28:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-eee320d88c8ac82ea860386a3dc5f31038f9b2809b7c6402d8cdcadf0a1703e2-merged.mount: Deactivated successfully.
Feb 16 13:28:09 compute-1 podman[208991]: 2026-02-16 13:28:09.288966572 +0000 UTC m=+0.077558633 container cleanup 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:28:09 compute-1 systemd[1]: libpod-conmon-6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d.scope: Deactivated successfully.
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.347 185914 INFO nova.virt.libvirt.driver [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Instance destroyed successfully.
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.348 185914 DEBUG nova.objects.instance [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lazy-loading 'resources' on Instance uuid 56c9e87d-b9eb-4307-80a2-8f5bf631c74d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:28:09 compute-1 podman[209022]: 2026-02-16 13:28:09.355581072 +0000 UTC m=+0.047624073 container remove 6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.358 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f6faf7b0-0c4a-4bf6-be4f-a79c39d4bcd2]: (4, ('Mon Feb 16 01:28:09 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc (6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d)\n6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d\nMon Feb 16 01:28:09 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc (6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d)\n6d93436395896dd44fc653f819163c4ce8ed5264952d5b2185a72739dc90309d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.360 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e2df560e-d9d0-4f7b-aaf5-60f97767ec5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.361 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f5abea-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.363 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 kernel: tap85f5abea-a0: left promiscuous mode
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.366 185914 DEBUG nova.virt.libvirt.vif [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:25:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-645930396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-645930396',id=7,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:25:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67efa696c46c451ba23d1157e0816503',ramdisk_id='',reservation_id='r-tyu7y93f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteBasicStrategy-2074109192',owner_user_name='tempest-TestExecuteBasicStrategy-2074109192-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:25:58Z,user_data=None,user_id='566db36bffff4193a494fef52f968126',uuid=56c9e87d-b9eb-4307-80a2-8f5bf631c74d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.367 185914 DEBUG nova.network.os_vif_util [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converting VIF {"id": "47879738-fc6d-440e-ab06-95bba1de09df", "address": "fa:16:3e:92:e3:68", "network": {"id": "85f5abea-ac25-4244-a69b-79e29b2ba1fc", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-950978049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67efa696c46c451ba23d1157e0816503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47879738-fc", "ovs_interfaceid": "47879738-fc6d-440e-ab06-95bba1de09df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.368 185914 DEBUG nova.network.os_vif_util [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.368 185914 DEBUG os_vif [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.369 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.369 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47879738-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.371 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.372 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.374 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[fc46addd-4200-4957-bf70-1348c052ffe6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.374 185914 INFO os_vif [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:e3:68,bridge_name='br-int',has_traffic_filtering=True,id=47879738-fc6d-440e-ab06-95bba1de09df,network=Network(85f5abea-ac25-4244-a69b-79e29b2ba1fc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47879738-fc')
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.375 185914 INFO nova.virt.libvirt.driver [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Deleting instance files /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d_del
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.375 185914 INFO nova.virt.libvirt.driver [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Deletion of /var/lib/nova/instances/56c9e87d-b9eb-4307-80a2-8f5bf631c74d_del complete
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.390 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfeb363-d429-48fa-b7bc-8f396344a7af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.392 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[acd1b023-f468-4866-b712-3f560c2511cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.402 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[28b20686-be3b-43a9-96df-f61360f79a20]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444245, 'reachable_time': 39670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209054, 'error': None, 'target': 'ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 systemd[1]: run-netns-ovnmeta\x2d85f5abea\x2dac25\x2d4244\x2da69b\x2d79e29b2ba1fc.mount: Deactivated successfully.
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.406 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85f5abea-ac25-4244-a69b-79e29b2ba1fc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:28:09 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:28:09.406 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[1da92387-940d-4a9c-be18-956f484f1683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.437 185914 INFO nova.compute.manager [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.438 185914 DEBUG oslo.service.loopingcall [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.438 185914 DEBUG nova.compute.manager [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.438 185914 DEBUG nova.network.neutron [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.536 185914 DEBUG nova.compute.manager [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-unplugged-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.537 185914 DEBUG oslo_concurrency.lockutils [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.537 185914 DEBUG oslo_concurrency.lockutils [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.537 185914 DEBUG oslo_concurrency.lockutils [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.537 185914 DEBUG nova.compute.manager [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] No waiting events found dispatching network-vif-unplugged-47879738-fc6d-440e-ab06-95bba1de09df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.537 185914 DEBUG nova.compute.manager [req-187d8535-8314-4285-8dfe-b1288f27c018 req-e466e5c3-01ff-46fd-ae38-6040ab6a328c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-unplugged-47879738-fc6d-440e-ab06-95bba1de09df for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.610 185914 DEBUG nova.compute.manager [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.610 185914 DEBUG oslo_concurrency.lockutils [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "c6353280-0641-466d-9963-30eb530755e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.611 185914 DEBUG oslo_concurrency.lockutils [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.611 185914 DEBUG oslo_concurrency.lockutils [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "c6353280-0641-466d-9963-30eb530755e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.611 185914 DEBUG nova.compute.manager [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] No waiting events found dispatching network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.611 185914 WARNING nova.compute.manager [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received unexpected event network-vif-plugged-68d12bd9-0c21-41b6-b775-1de285c4be2c for instance with vm_state deleted and task_state None.
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.611 185914 DEBUG nova.compute.manager [req-3a4cb3e5-5026-49d7-8ca7-c32b4b965a21 req-d4d00bac-e7d4-4dc6-bb8f-ad3497132bb0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: c6353280-0641-466d-9963-30eb530755e9] Received event network-vif-deleted-68d12bd9-0c21-41b6-b775-1de285c4be2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:09 compute-1 nova_compute[185910]: 2026-02-16 13:28:09.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.081 185914 DEBUG nova.network.neutron [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.128 185914 INFO nova.compute.manager [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Took 0.69 seconds to deallocate network for instance.
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.197 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.198 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.265 185914 DEBUG nova.compute.provider_tree [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.304 185914 DEBUG nova.scheduler.client.report [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.350 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.422 185914 INFO nova.scheduler.client.report [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Deleted allocations for instance 56c9e87d-b9eb-4307-80a2-8f5bf631c74d
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.502 185914 DEBUG oslo_concurrency.lockutils [None req-88443ddb-0017-4c78-9a78-2158b97a98d2 566db36bffff4193a494fef52f968126 67efa696c46c451ba23d1157e0816503 - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.641 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.643 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.670 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.670 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.670 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.826 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.827 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5845MB free_disk=73.22797775268555GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.827 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.827 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.890 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.891 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.925 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.941 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.966 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:28:10 compute-1 nova_compute[185910]: 2026-02-16 13:28:10.966 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.655 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.659 185914 DEBUG nova.compute.manager [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.659 185914 DEBUG oslo_concurrency.lockutils [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.660 185914 DEBUG oslo_concurrency.lockutils [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.660 185914 DEBUG oslo_concurrency.lockutils [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "56c9e87d-b9eb-4307-80a2-8f5bf631c74d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.660 185914 DEBUG nova.compute.manager [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] No waiting events found dispatching network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.660 185914 WARNING nova.compute.manager [req-df630dd0-956a-400a-bef0-7a9bf9580177 req-96fa019e-b7a0-4694-bfbc-9631dfd5d27f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received unexpected event network-vif-plugged-47879738-fc6d-440e-ab06-95bba1de09df for instance with vm_state deleted and task_state None.
Feb 16 13:28:11 compute-1 nova_compute[185910]: 2026-02-16 13:28:11.777 185914 DEBUG nova.compute.manager [req-a63846bc-431b-4265-b73c-ce384e7c647a req-77233031-cd3e-4f72-8448-ed747c50bc8c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Received event network-vif-deleted-47879738-fc6d-440e-ab06-95bba1de09df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:28:12 compute-1 nova_compute[185910]: 2026-02-16 13:28:12.655 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:12 compute-1 nova_compute[185910]: 2026-02-16 13:28:12.656 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:28:12 compute-1 nova_compute[185910]: 2026-02-16 13:28:12.656 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:28:12 compute-1 nova_compute[185910]: 2026-02-16 13:28:12.700 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:28:13 compute-1 nova_compute[185910]: 2026-02-16 13:28:13.236 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:13 compute-1 nova_compute[185910]: 2026-02-16 13:28:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:13 compute-1 nova_compute[185910]: 2026-02-16 13:28:13.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:28:14 compute-1 nova_compute[185910]: 2026-02-16 13:28:14.371 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:15 compute-1 podman[209056]: 2026-02-16 13:28:15.922729243 +0000 UTC m=+0.062206243 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:28:17 compute-1 nova_compute[185910]: 2026-02-16 13:28:17.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:28:18 compute-1 nova_compute[185910]: 2026-02-16 13:28:18.237 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:19 compute-1 nova_compute[185910]: 2026-02-16 13:28:19.372 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:19 compute-1 openstack_network_exporter[198096]: ERROR   13:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:28:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:28:19 compute-1 openstack_network_exporter[198096]: ERROR   13:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:28:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:28:21 compute-1 nova_compute[185910]: 2026-02-16 13:28:21.129 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248486.1288571, c6353280-0641-466d-9963-30eb530755e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:28:21 compute-1 nova_compute[185910]: 2026-02-16 13:28:21.130 185914 INFO nova.compute.manager [-] [instance: c6353280-0641-466d-9963-30eb530755e9] VM Stopped (Lifecycle Event)
Feb 16 13:28:21 compute-1 nova_compute[185910]: 2026-02-16 13:28:21.150 185914 DEBUG nova.compute.manager [None req-c0e68432-25ee-44bf-86e6-01b5ec1970de - - - - - -] [instance: c6353280-0641-466d-9963-30eb530755e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:28:23 compute-1 nova_compute[185910]: 2026-02-16 13:28:23.239 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:24 compute-1 nova_compute[185910]: 2026-02-16 13:28:24.346 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248489.344193, 56c9e87d-b9eb-4307-80a2-8f5bf631c74d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:28:24 compute-1 nova_compute[185910]: 2026-02-16 13:28:24.347 185914 INFO nova.compute.manager [-] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] VM Stopped (Lifecycle Event)
Feb 16 13:28:24 compute-1 nova_compute[185910]: 2026-02-16 13:28:24.370 185914 DEBUG nova.compute.manager [None req-222f5d7a-3abd-49d4-996a-c4b694749426 - - - - - -] [instance: 56c9e87d-b9eb-4307-80a2-8f5bf631c74d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:28:24 compute-1 nova_compute[185910]: 2026-02-16 13:28:24.374 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:27 compute-1 sshd-session[209080]: Invalid user admin from 146.190.226.24 port 34702
Feb 16 13:28:27 compute-1 sshd-session[209080]: Connection closed by invalid user admin 146.190.226.24 port 34702 [preauth]
Feb 16 13:28:28 compute-1 nova_compute[185910]: 2026-02-16 13:28:28.241 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:29 compute-1 nova_compute[185910]: 2026-02-16 13:28:29.376 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:33 compute-1 nova_compute[185910]: 2026-02-16 13:28:33.257 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:33 compute-1 podman[209083]: 2026-02-16 13:28:33.349099934 +0000 UTC m=+0.054935349 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:28:33 compute-1 podman[209082]: 2026-02-16 13:28:33.386218396 +0000 UTC m=+0.094622569 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:28:34 compute-1 nova_compute[185910]: 2026-02-16 13:28:34.378 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:35 compute-1 podman[195236]: time="2026-02-16T13:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:28:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:28:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 13:28:38 compute-1 nova_compute[185910]: 2026-02-16 13:28:38.285 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:38 compute-1 podman[209121]: 2026-02-16 13:28:38.936960378 +0000 UTC m=+0.080666736 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 13:28:39 compute-1 nova_compute[185910]: 2026-02-16 13:28:39.382 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:40 compute-1 ovn_controller[96285]: 2026-02-16T13:28:40Z|00078|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:28:43 compute-1 nova_compute[185910]: 2026-02-16 13:28:43.287 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:44 compute-1 nova_compute[185910]: 2026-02-16 13:28:44.385 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:44 compute-1 sshd-session[209147]: Invalid user hadoop from 188.166.42.159 port 43100
Feb 16 13:28:44 compute-1 sshd-session[209147]: Connection closed by invalid user hadoop 188.166.42.159 port 43100 [preauth]
Feb 16 13:28:46 compute-1 podman[209149]: 2026-02-16 13:28:46.908845246 +0000 UTC m=+0.051999840 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:28:48 compute-1 nova_compute[185910]: 2026-02-16 13:28:48.299 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:49 compute-1 openstack_network_exporter[198096]: ERROR   13:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:28:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:28:49 compute-1 openstack_network_exporter[198096]: ERROR   13:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:28:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:28:49 compute-1 nova_compute[185910]: 2026-02-16 13:28:49.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:53 compute-1 nova_compute[185910]: 2026-02-16 13:28:53.300 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:54 compute-1 nova_compute[185910]: 2026-02-16 13:28:54.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:58 compute-1 nova_compute[185910]: 2026-02-16 13:28:58.303 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:28:59 compute-1 nova_compute[185910]: 2026-02-16 13:28:59.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:03 compute-1 nova_compute[185910]: 2026-02-16 13:29:03.304 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:03.336 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:03.336 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:03.336 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:03 compute-1 podman[209174]: 2026-02-16 13:29:03.914364513 +0000 UTC m=+0.048870797 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:29:03 compute-1 podman[209173]: 2026-02-16 13:29:03.933182385 +0000 UTC m=+0.072628231 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 16 13:29:04 compute-1 nova_compute[185910]: 2026-02-16 13:29:04.445 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:04 compute-1 nova_compute[185910]: 2026-02-16 13:29:04.651 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:05 compute-1 nova_compute[185910]: 2026-02-16 13:29:05.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:05 compute-1 nova_compute[185910]: 2026-02-16 13:29:05.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:05 compute-1 podman[195236]: time="2026-02-16T13:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:29:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:29:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:29:08 compute-1 nova_compute[185910]: 2026-02-16 13:29:08.306 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:08 compute-1 sshd-session[209213]: Invalid user solana from 2.57.122.210 port 49870
Feb 16 13:29:08 compute-1 sshd-session[209213]: Connection closed by invalid user solana 2.57.122.210 port 49870 [preauth]
Feb 16 13:29:09 compute-1 nova_compute[185910]: 2026-02-16 13:29:09.448 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:09 compute-1 nova_compute[185910]: 2026-02-16 13:29:09.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:09 compute-1 nova_compute[185910]: 2026-02-16 13:29:09.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:09 compute-1 podman[209215]: 2026-02-16 13:29:09.934420494 +0000 UTC m=+0.070128445 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.665 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.665 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.789 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.791 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5856MB free_disk=73.2276382446289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.791 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.792 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.932 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:29:10 compute-1 nova_compute[185910]: 2026-02-16 13:29:10.932 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:29:11 compute-1 nova_compute[185910]: 2026-02-16 13:29:11.007 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:29:11 compute-1 nova_compute[185910]: 2026-02-16 13:29:11.042 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:29:11 compute-1 nova_compute[185910]: 2026-02-16 13:29:11.043 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:29:11 compute-1 nova_compute[185910]: 2026-02-16 13:29:11.044 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.039 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.308 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:29:13 compute-1 nova_compute[185910]: 2026-02-16 13:29:13.651 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:29:14 compute-1 nova_compute[185910]: 2026-02-16 13:29:14.451 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:14 compute-1 nova_compute[185910]: 2026-02-16 13:29:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:14 compute-1 nova_compute[185910]: 2026-02-16 13:29:14.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:29:17 compute-1 nova_compute[185910]: 2026-02-16 13:29:17.206 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:17.206 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:29:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:17.207 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:29:17 compute-1 nova_compute[185910]: 2026-02-16 13:29:17.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:29:17 compute-1 podman[209241]: 2026-02-16 13:29:17.909931898 +0000 UTC m=+0.054167327 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:29:18 compute-1 nova_compute[185910]: 2026-02-16 13:29:18.309 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:19 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:29:19.209 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:29:19 compute-1 openstack_network_exporter[198096]: ERROR   13:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:29:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:29:19 compute-1 openstack_network_exporter[198096]: ERROR   13:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:29:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:29:19 compute-1 nova_compute[185910]: 2026-02-16 13:29:19.454 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:20 compute-1 nova_compute[185910]: 2026-02-16 13:29:20.198 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:23 compute-1 nova_compute[185910]: 2026-02-16 13:29:23.311 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:24 compute-1 nova_compute[185910]: 2026-02-16 13:29:24.455 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:28 compute-1 nova_compute[185910]: 2026-02-16 13:29:28.314 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:29 compute-1 nova_compute[185910]: 2026-02-16 13:29:29.458 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:30 compute-1 sshd-session[209266]: Invalid user admin from 146.190.226.24 port 51772
Feb 16 13:29:30 compute-1 sshd-session[209266]: Connection closed by invalid user admin 146.190.226.24 port 51772 [preauth]
Feb 16 13:29:33 compute-1 nova_compute[185910]: 2026-02-16 13:29:33.316 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:34 compute-1 nova_compute[185910]: 2026-02-16 13:29:34.494 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:34 compute-1 podman[209269]: 2026-02-16 13:29:34.905783523 +0000 UTC m=+0.042758113 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:29:34 compute-1 podman[209268]: 2026-02-16 13:29:34.932090026 +0000 UTC m=+0.074196703 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public)
Feb 16 13:29:35 compute-1 podman[195236]: time="2026-02-16T13:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:29:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:29:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:29:37 compute-1 sshd-session[209310]: Invalid user git from 188.166.42.159 port 46572
Feb 16 13:29:37 compute-1 sshd-session[209310]: Connection closed by invalid user git 188.166.42.159 port 46572 [preauth]
Feb 16 13:29:38 compute-1 nova_compute[185910]: 2026-02-16 13:29:38.319 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:39 compute-1 nova_compute[185910]: 2026-02-16 13:29:39.496 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:40 compute-1 podman[209312]: 2026-02-16 13:29:40.923181874 +0000 UTC m=+0.068365967 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:29:43 compute-1 nova_compute[185910]: 2026-02-16 13:29:43.319 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:44 compute-1 nova_compute[185910]: 2026-02-16 13:29:44.497 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:48 compute-1 nova_compute[185910]: 2026-02-16 13:29:48.321 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:48 compute-1 podman[209339]: 2026-02-16 13:29:48.9089665 +0000 UTC m=+0.049523895 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:29:49 compute-1 openstack_network_exporter[198096]: ERROR   13:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:29:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:29:49 compute-1 openstack_network_exporter[198096]: ERROR   13:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:29:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:29:49 compute-1 nova_compute[185910]: 2026-02-16 13:29:49.500 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:53 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:29:53 compute-1 nova_compute[185910]: 2026-02-16 13:29:53.323 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:54 compute-1 nova_compute[185910]: 2026-02-16 13:29:54.502 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:55 compute-1 ovn_controller[96285]: 2026-02-16T13:29:55Z|00079|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:29:58 compute-1 nova_compute[185910]: 2026-02-16 13:29:58.324 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:29:59 compute-1 nova_compute[185910]: 2026-02-16 13:29:59.505 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:03 compute-1 nova_compute[185910]: 2026-02-16 13:30:03.326 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:03.336 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:03.337 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:03.337 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:04 compute-1 nova_compute[185910]: 2026-02-16 13:30:04.508 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:04 compute-1 nova_compute[185910]: 2026-02-16 13:30:04.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:05 compute-1 nova_compute[185910]: 2026-02-16 13:30:05.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:05 compute-1 podman[195236]: time="2026-02-16T13:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:30:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:30:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 16 13:30:05 compute-1 podman[209365]: 2026-02-16 13:30:05.956866505 +0000 UTC m=+0.093674633 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:30:05 compute-1 podman[209366]: 2026-02-16 13:30:05.975183442 +0000 UTC m=+0.106807539 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:30:06 compute-1 nova_compute[185910]: 2026-02-16 13:30:06.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:08 compute-1 nova_compute[185910]: 2026-02-16 13:30:08.355 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:09 compute-1 nova_compute[185910]: 2026-02-16 13:30:09.515 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:09 compute-1 nova_compute[185910]: 2026-02-16 13:30:09.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:10 compute-1 nova_compute[185910]: 2026-02-16 13:30:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:11 compute-1 podman[209407]: 2026-02-16 13:30:11.952798033 +0000 UTC m=+0.089208291 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.663 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.798 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.799 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5868MB free_disk=73.22370529174805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.799 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.800 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.882 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.883 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.908 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.928 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.930 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:30:12 compute-1 nova_compute[185910]: 2026-02-16 13:30:12.930 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:13 compute-1 nova_compute[185910]: 2026-02-16 13:30:13.357 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:14 compute-1 nova_compute[185910]: 2026-02-16 13:30:14.518 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:14 compute-1 nova_compute[185910]: 2026-02-16 13:30:14.931 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:14 compute-1 nova_compute[185910]: 2026-02-16 13:30:14.932 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:30:15 compute-1 nova_compute[185910]: 2026-02-16 13:30:15.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:30:15 compute-1 nova_compute[185910]: 2026-02-16 13:30:15.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:30:15 compute-1 nova_compute[185910]: 2026-02-16 13:30:15.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:30:15 compute-1 nova_compute[185910]: 2026-02-16 13:30:15.649 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:30:17 compute-1 nova_compute[185910]: 2026-02-16 13:30:17.914 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:17.915 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:30:17 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:17.917 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:30:18 compute-1 nova_compute[185910]: 2026-02-16 13:30:18.361 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:30:18.920 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:30:19 compute-1 openstack_network_exporter[198096]: ERROR   13:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:30:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:30:19 compute-1 openstack_network_exporter[198096]: ERROR   13:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:30:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:30:19 compute-1 nova_compute[185910]: 2026-02-16 13:30:19.519 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:19 compute-1 podman[209433]: 2026-02-16 13:30:19.948078314 +0000 UTC m=+0.084109943 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:30:23 compute-1 nova_compute[185910]: 2026-02-16 13:30:23.362 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:24 compute-1 nova_compute[185910]: 2026-02-16 13:30:24.523 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:28 compute-1 nova_compute[185910]: 2026-02-16 13:30:28.364 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:29 compute-1 nova_compute[185910]: 2026-02-16 13:30:29.525 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:32 compute-1 sshd-session[209457]: Invalid user admin from 146.190.226.24 port 41108
Feb 16 13:30:32 compute-1 sshd-session[209457]: Connection closed by invalid user admin 146.190.226.24 port 41108 [preauth]
Feb 16 13:30:33 compute-1 nova_compute[185910]: 2026-02-16 13:30:33.366 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:34 compute-1 nova_compute[185910]: 2026-02-16 13:30:34.528 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:35 compute-1 sshd-session[209459]: Invalid user deploy from 188.166.42.159 port 35574
Feb 16 13:30:35 compute-1 sshd-session[209459]: Connection closed by invalid user deploy 188.166.42.159 port 35574 [preauth]
Feb 16 13:30:35 compute-1 podman[195236]: time="2026-02-16T13:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:30:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:30:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Feb 16 13:30:36 compute-1 podman[209461]: 2026-02-16 13:30:36.915568526 +0000 UTC m=+0.055799985 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, architecture=x86_64, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible)
Feb 16 13:30:36 compute-1 podman[209462]: 2026-02-16 13:30:36.939891706 +0000 UTC m=+0.075965212 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:30:38 compute-1 nova_compute[185910]: 2026-02-16 13:30:38.369 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:39 compute-1 nova_compute[185910]: 2026-02-16 13:30:39.531 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:42 compute-1 podman[209498]: 2026-02-16 13:30:42.925127934 +0000 UTC m=+0.071698877 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:30:43 compute-1 nova_compute[185910]: 2026-02-16 13:30:43.371 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:44 compute-1 nova_compute[185910]: 2026-02-16 13:30:44.533 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:48 compute-1 nova_compute[185910]: 2026-02-16 13:30:48.373 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:49 compute-1 openstack_network_exporter[198096]: ERROR   13:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:30:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:30:49 compute-1 openstack_network_exporter[198096]: ERROR   13:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:30:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:30:49 compute-1 nova_compute[185910]: 2026-02-16 13:30:49.534 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:50 compute-1 podman[209524]: 2026-02-16 13:30:50.898922211 +0000 UTC m=+0.045318890 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:30:53 compute-1 nova_compute[185910]: 2026-02-16 13:30:53.374 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:54 compute-1 nova_compute[185910]: 2026-02-16 13:30:54.537 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:58 compute-1 nova_compute[185910]: 2026-02-16 13:30:58.377 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.230 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.231 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.248 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.352 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.353 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.363 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.364 185914 INFO nova.compute.claims [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.534 185914 DEBUG nova.compute.provider_tree [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.539 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.554 185914 DEBUG nova.scheduler.client.report [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.588 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.589 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.631 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.632 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.654 185914 INFO nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.670 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.753 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.754 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.755 185914 INFO nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating image(s)
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.755 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.756 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.757 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.774 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.819 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.820 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.821 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.832 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.875 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.876 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.905 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.906 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.906 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.947 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.948 185914 DEBUG nova.virt.disk.api [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Checking if we can resize image /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.949 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.996 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.997 185914 DEBUG nova.virt.disk.api [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Cannot resize image /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:30:59 compute-1 nova_compute[185910]: 2026-02-16 13:30:59.997 185914 DEBUG nova.objects.instance [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'migration_context' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:31:00 compute-1 nova_compute[185910]: 2026-02-16 13:31:00.012 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:31:00 compute-1 nova_compute[185910]: 2026-02-16 13:31:00.013 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Ensure instance console log exists: /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:31:00 compute-1 nova_compute[185910]: 2026-02-16 13:31:00.013 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:00 compute-1 nova_compute[185910]: 2026-02-16 13:31:00.013 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:00 compute-1 nova_compute[185910]: 2026-02-16 13:31:00.014 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:01 compute-1 nova_compute[185910]: 2026-02-16 13:31:01.850 185914 DEBUG nova.policy [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8712c0037def471dabf14879c0a418ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:31:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:03.337 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:03.338 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:03.338 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:03 compute-1 nova_compute[185910]: 2026-02-16 13:31:03.381 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:04 compute-1 nova_compute[185910]: 2026-02-16 13:31:04.542 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:05 compute-1 nova_compute[185910]: 2026-02-16 13:31:05.025 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Successfully created port: 9ac0912f-d593-4dad-bf05-01d7dd0b6677 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:31:05 compute-1 podman[195236]: time="2026-02-16T13:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:31:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:31:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 13:31:06 compute-1 nova_compute[185910]: 2026-02-16 13:31:06.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:06 compute-1 nova_compute[185910]: 2026-02-16 13:31:06.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.206 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Successfully updated port: 9ac0912f-d593-4dad-bf05-01d7dd0b6677 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.229 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.229 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.230 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.318 185914 DEBUG nova.compute.manager [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-changed-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.319 185914 DEBUG nova.compute.manager [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Refreshing instance network info cache due to event network-changed-9ac0912f-d593-4dad-bf05-01d7dd0b6677. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.319 185914 DEBUG oslo_concurrency.lockutils [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:31:07 compute-1 nova_compute[185910]: 2026-02-16 13:31:07.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:07 compute-1 podman[209563]: 2026-02-16 13:31:07.90868658 +0000 UTC m=+0.052569837 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.buildah.version=1.33.7)
Feb 16 13:31:07 compute-1 podman[209564]: 2026-02-16 13:31:07.912810352 +0000 UTC m=+0.052206787 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:31:08 compute-1 nova_compute[185910]: 2026-02-16 13:31:08.072 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:31:08 compute-1 nova_compute[185910]: 2026-02-16 13:31:08.382 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.429 185914 DEBUG nova.network.neutron [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.456 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.457 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance network_info: |[{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.458 185914 DEBUG oslo_concurrency.lockutils [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.459 185914 DEBUG nova.network.neutron [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Refreshing network info cache for port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.464 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Start _get_guest_xml network_info=[{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.470 185914 WARNING nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.486 185914 DEBUG nova.virt.libvirt.host [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.487 185914 DEBUG nova.virt.libvirt.host [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.491 185914 DEBUG nova.virt.libvirt.host [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.492 185914 DEBUG nova.virt.libvirt.host [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.494 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.494 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.495 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.495 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.496 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.496 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.497 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.497 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.498 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.499 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.499 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.499 185914 DEBUG nova.virt.hardware [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.505 185914 DEBUG nova.virt.libvirt.vif [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStra
tegy-464275700-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:30:59Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.506 185914 DEBUG nova.network.os_vif_util [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.508 185914 DEBUG nova.network.os_vif_util [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.509 185914 DEBUG nova.objects.instance [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lazy-loading 'pci_devices' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.531 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <uuid>07689e3f-f214-4f57-a662-bc531b614c3d</uuid>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <name>instance-00000009</name>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1190069071</nova:name>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:31:09</nova:creationTime>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:user uuid="8712c0037def471dabf14879c0a418ec">tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member</nova:user>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:project uuid="9d212b8e966a499a9aad9b972bb7e76d">tempest-TestExecuteHostMaintenanceStrategy-464275700</nova:project>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         <nova:port uuid="9ac0912f-d593-4dad-bf05-01d7dd0b6677">
Feb 16 13:31:09 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <system>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="serial">07689e3f-f214-4f57-a662-bc531b614c3d</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="uuid">07689e3f-f214-4f57-a662-bc531b614c3d</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </system>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <os>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </os>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <features>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </features>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:ba:1f:94"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <target dev="tap9ac0912f-d5"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/console.log" append="off"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <video>
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </video>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:31:09 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:31:09 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:31:09 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:31:09 compute-1 nova_compute[185910]: </domain>
Feb 16 13:31:09 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.532 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Preparing to wait for external event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.532 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.533 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.533 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.534 185914 DEBUG nova.virt.libvirt.vif [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaint
enanceStrategy-464275700-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:30:59Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.534 185914 DEBUG nova.network.os_vif_util [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.534 185914 DEBUG nova.network.os_vif_util [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.535 185914 DEBUG os_vif [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.535 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.536 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.536 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.540 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.540 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ac0912f-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.541 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ac0912f-d5, col_values=(('external_ids', {'iface-id': '9ac0912f-d593-4dad-bf05-01d7dd0b6677', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:1f:94', 'vm-uuid': '07689e3f-f214-4f57-a662-bc531b614c3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.542 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-1 NetworkManager[56388]: <info>  [1771248669.5437] manager: (tap9ac0912f-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.545 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.549 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.550 185914 INFO os_vif [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5')
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.601 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.601 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.601 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] No VIF found with MAC fa:16:3e:ba:1f:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:31:09 compute-1 nova_compute[185910]: 2026-02-16 13:31:09.602 185914 INFO nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Using config drive
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.005 185914 INFO nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Creating config drive at /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.008 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7dnuhe71 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.128 185914 DEBUG oslo_concurrency.processutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7dnuhe71" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:10 compute-1 kernel: tap9ac0912f-d5: entered promiscuous mode
Feb 16 13:31:10 compute-1 ovn_controller[96285]: 2026-02-16T13:31:10Z|00080|binding|INFO|Claiming lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 for this chassis.
Feb 16 13:31:10 compute-1 ovn_controller[96285]: 2026-02-16T13:31:10Z|00081|binding|INFO|9ac0912f-d593-4dad-bf05-01d7dd0b6677: Claiming fa:16:3e:ba:1f:94 10.100.0.14
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.1887] manager: (tap9ac0912f-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.188 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.205 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:1f:94 10.100.0.14'], port_security=['fa:16:3e:ba:1f:94 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '07689e3f-f214-4f57-a662-bc531b614c3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=9ac0912f-d593-4dad-bf05-01d7dd0b6677) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.207 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 in datapath 34e10b77-8ec0-4af1-a031-d83792585eee bound to our chassis
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.208 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.207 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.211 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 systemd-udevd[209622]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:31:10 compute-1 ovn_controller[96285]: 2026-02-16T13:31:10Z|00082|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 ovn-installed in OVS
Feb 16 13:31:10 compute-1 ovn_controller[96285]: 2026-02-16T13:31:10Z|00083|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 up in Southbound
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.216 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 systemd-machined[155419]: New machine qemu-7-instance-00000009.
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.221 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4d1c60-512e-4bea-b7df-410734383c16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.223 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap34e10b77-81 in ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.225 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap34e10b77-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.225 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1f35a328-1708-4295-841d-c8ff5af997a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.226 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[91c13f0c-bf16-408a-80ea-ab829901ecc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.2284] device (tap9ac0912f-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.2291] device (tap9ac0912f-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:31:10 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-00000009.
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.236 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[82b761f3-f5da-400e-ac10-e20221c0cb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.257 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8f54c75e-57d5-46c6-a0f6-820ee45e10b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.280 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[24d1fcd7-9947-4a44-bfc3-2573a19b3270]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 systemd-udevd[209626]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.286 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[89ec3060-3107-4d79-aab2-de620e521778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.2882] manager: (tap34e10b77-80): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.309 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[8f185fa6-e29c-4fb0-a5f6-680123f3d377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.312 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[6034ba11-a301-4eec-9218-e5c50200ff38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.3301] device (tap34e10b77-80): carrier: link connected
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.332 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[0adda8fb-327a-4e24-8bcb-698ef4a3f498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.345 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[876afb84-c059-43e2-a121-0da4e62e028d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475475, 'reachable_time': 39098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209656, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.356 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[970887f0-a31e-4915-beea-47976f6ceb0a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:3113'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475475, 'tstamp': 475475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209657, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.370 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[dc64f702-c8c1-4fa2-96a7-1e2bc8d01c5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap34e10b77-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:31:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475475, 'reachable_time': 39098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209658, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.387 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e4fef1d6-1b18-4164-91fc-3b895ccd7d69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.422 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[074bb1cb-2bbc-4c26-ab85-e70bcbb93b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.423 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.424 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.424 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap34e10b77-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:10 compute-1 kernel: tap34e10b77-80: entered promiscuous mode
Feb 16 13:31:10 compute-1 NetworkManager[56388]: <info>  [1771248670.4270] manager: (tap34e10b77-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.426 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.429 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap34e10b77-80, col_values=(('external_ids', {'iface-id': '37eb0121-3449-47dc-8fd8-69d7f9268b6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:10 compute-1 ovn_controller[96285]: 2026-02-16T13:31:10Z|00084|binding|INFO|Releasing lport 37eb0121-3449-47dc-8fd8-69d7f9268b6f from this chassis (sb_readonly=0)
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.430 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.431 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[83773f7c-6dac-4673-888b-99e4b30e4f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.432 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/34e10b77-8ec0-4af1-a031-d83792585eee.pid.haproxy
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 34e10b77-8ec0-4af1-a031-d83792585eee
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:31:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:10.433 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'env', 'PROCESS_TAG=haproxy-34e10b77-8ec0-4af1-a031-d83792585eee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/34e10b77-8ec0-4af1-a031-d83792585eee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.435 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.535 185914 DEBUG nova.compute.manager [req-b01c23e1-ba48-40ab-9d84-16cb6eed4f5c req-89585521-628c-40f4-916f-a9b7252160b0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.536 185914 DEBUG oslo_concurrency.lockutils [req-b01c23e1-ba48-40ab-9d84-16cb6eed4f5c req-89585521-628c-40f4-916f-a9b7252160b0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.537 185914 DEBUG oslo_concurrency.lockutils [req-b01c23e1-ba48-40ab-9d84-16cb6eed4f5c req-89585521-628c-40f4-916f-a9b7252160b0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.537 185914 DEBUG oslo_concurrency.lockutils [req-b01c23e1-ba48-40ab-9d84-16cb6eed4f5c req-89585521-628c-40f4-916f-a9b7252160b0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.537 185914 DEBUG nova.compute.manager [req-b01c23e1-ba48-40ab-9d84-16cb6eed4f5c req-89585521-628c-40f4-916f-a9b7252160b0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Processing event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:10 compute-1 podman[209690]: 2026-02-16 13:31:10.790889369 +0000 UTC m=+0.046543574 container create a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 16 13:31:10 compute-1 systemd[1]: Started libpod-conmon-a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0.scope.
Feb 16 13:31:10 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.846 185914 DEBUG nova.network.neutron [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updated VIF entry in instance network info cache for port 9ac0912f-d593-4dad-bf05-01d7dd0b6677. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.848 185914 DEBUG nova.network.neutron [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:31:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd6a1d25bc523b1dc5f2bf65c1750479e5ad75ba6bd1e46312f9fdc914afb4bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:31:10 compute-1 podman[209690]: 2026-02-16 13:31:10.765140011 +0000 UTC m=+0.020794246 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:31:10 compute-1 podman[209690]: 2026-02-16 13:31:10.864358523 +0000 UTC m=+0.120012798 container init a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 16 13:31:10 compute-1 podman[209690]: 2026-02-16 13:31:10.868967168 +0000 UTC m=+0.124621373 container start a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.885 185914 DEBUG oslo_concurrency.lockutils [req-24b7ecd8-cb8f-48b4-b4e1-4f8c77d8e823 req-de327354-692b-4efa-85fa-2c44499398d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:31:10 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [NOTICE]   (209710) : New worker (209713) forked
Feb 16 13:31:10 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [NOTICE]   (209710) : Loading success.
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.996 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.997 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248670.995445, 07689e3f-f214-4f57-a662-bc531b614c3d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:10 compute-1 nova_compute[185910]: 2026-02-16 13:31:10.999 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Started (Lifecycle Event)
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.002 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.006 185914 INFO nova.virt.libvirt.driver [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance spawned successfully.
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.007 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.027 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.034 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.037 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.037 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.037 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.038 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.038 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.039 185914 DEBUG nova.virt.libvirt.driver [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.062 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.063 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248670.9957314, 07689e3f-f214-4f57-a662-bc531b614c3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.063 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Paused (Lifecycle Event)
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.089 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.094 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248670.9994824, 07689e3f-f214-4f57-a662-bc531b614c3d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.094 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Resumed (Lifecycle Event)
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.098 185914 INFO nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Took 11.34 seconds to spawn the instance on the hypervisor.
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.098 185914 DEBUG nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.125 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.129 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.172 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.179 185914 INFO nova.compute.manager [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Took 11.86 seconds to build instance.
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.198 185914 DEBUG oslo_concurrency.lockutils [None req-73050104-6d6d-4c9f-8d97-822afdea53a7 8712c0037def471dabf14879c0a418ec 9d212b8e966a499a9aad9b972bb7e76d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:11 compute-1 nova_compute[185910]: 2026-02-16 13:31:11.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.628 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.635 185914 DEBUG nova.compute.manager [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.636 185914 DEBUG oslo_concurrency.lockutils [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.636 185914 DEBUG oslo_concurrency.lockutils [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.637 185914 DEBUG oslo_concurrency.lockutils [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.637 185914 DEBUG nova.compute.manager [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:31:12 compute-1 nova_compute[185910]: 2026-02-16 13:31:12.637 185914 WARNING nova.compute.manager [req-0fa03c4f-1aa7-43ce-866d-797e4135d622 req-bd146673-1dda-43a9-88dc-22046c5f85d6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state None.
Feb 16 13:31:13 compute-1 nova_compute[185910]: 2026-02-16 13:31:13.384 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:13 compute-1 podman[209728]: 2026-02-16 13:31:13.969928162 +0000 UTC m=+0.101399883 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.544 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.660 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.661 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.661 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.662 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.730 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.777 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.778 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.824 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.962 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.964 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5668MB free_disk=73.22283554077148GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.964 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:31:14 compute-1 nova_compute[185910]: 2026-02-16 13:31:14.964 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.040 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 07689e3f-f214-4f57-a662-bc531b614c3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.040 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.040 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.056 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.074 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.075 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.092 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.114 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.172 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.189 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.216 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:31:15 compute-1 nova_compute[185910]: 2026-02-16 13:31:15.216 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:31:16 compute-1 nova_compute[185910]: 2026-02-16 13:31:16.217 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:16 compute-1 nova_compute[185910]: 2026-02-16 13:31:16.218 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:31:16 compute-1 nova_compute[185910]: 2026-02-16 13:31:16.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:16 compute-1 nova_compute[185910]: 2026-02-16 13:31:16.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:31:16 compute-1 nova_compute[185910]: 2026-02-16 13:31:16.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:31:17 compute-1 nova_compute[185910]: 2026-02-16 13:31:17.281 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:31:17 compute-1 nova_compute[185910]: 2026-02-16 13:31:17.282 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:31:17 compute-1 nova_compute[185910]: 2026-02-16 13:31:17.282 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:31:17 compute-1 nova_compute[185910]: 2026-02-16 13:31:17.282 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:31:18 compute-1 nova_compute[185910]: 2026-02-16 13:31:18.388 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:19 compute-1 openstack_network_exporter[198096]: ERROR   13:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:31:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:31:19 compute-1 openstack_network_exporter[198096]: ERROR   13:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:31:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:31:19 compute-1 nova_compute[185910]: 2026-02-16 13:31:19.546 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:19 compute-1 nova_compute[185910]: 2026-02-16 13:31:19.767 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:31:19 compute-1 nova_compute[185910]: 2026-02-16 13:31:19.793 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:31:19 compute-1 nova_compute[185910]: 2026-02-16 13:31:19.793 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:31:21 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:21.840 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:31:21 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:21.842 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:31:21 compute-1 nova_compute[185910]: 2026-02-16 13:31:21.873 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:21 compute-1 podman[209765]: 2026-02-16 13:31:21.936871674 +0000 UTC m=+0.048559599 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:31:23 compute-1 nova_compute[185910]: 2026-02-16 13:31:23.391 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:23 compute-1 nova_compute[185910]: 2026-02-16 13:31:23.787 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:31:24 compute-1 ovn_controller[96285]: 2026-02-16T13:31:24Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:1f:94 10.100.0.14
Feb 16 13:31:24 compute-1 ovn_controller[96285]: 2026-02-16T13:31:24Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:1f:94 10.100.0.14
Feb 16 13:31:24 compute-1 nova_compute[185910]: 2026-02-16 13:31:24.550 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:25 compute-1 sshd-session[209799]: banner exchange: Connection from 218.52.254.90 port 39928: invalid format
Feb 16 13:31:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:31:26.844 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:31:28 compute-1 nova_compute[185910]: 2026-02-16 13:31:28.394 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:28 compute-1 sshd-session[209800]: Invalid user ubuntu from 2.57.122.210 port 52598
Feb 16 13:31:28 compute-1 sshd-session[209800]: Connection closed by invalid user ubuntu 2.57.122.210 port 52598 [preauth]
Feb 16 13:31:29 compute-1 nova_compute[185910]: 2026-02-16 13:31:29.553 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:30 compute-1 sshd-session[209802]: Invalid user test from 188.166.42.159 port 49320
Feb 16 13:31:31 compute-1 sshd-session[209802]: Connection closed by invalid user test 188.166.42.159 port 49320 [preauth]
Feb 16 13:31:33 compute-1 nova_compute[185910]: 2026-02-16 13:31:33.396 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:34 compute-1 nova_compute[185910]: 2026-02-16 13:31:34.556 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:35 compute-1 podman[195236]: time="2026-02-16T13:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:31:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:31:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Feb 16 13:31:37 compute-1 sshd-session[209804]: Received disconnect from 218.52.254.90 port 42298:11: Bye Bye [preauth]
Feb 16 13:31:37 compute-1 sshd-session[209804]: Disconnected from 218.52.254.90 port 42298 [preauth]
Feb 16 13:31:38 compute-1 nova_compute[185910]: 2026-02-16 13:31:38.397 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:38 compute-1 podman[209809]: 2026-02-16 13:31:38.93246347 +0000 UTC m=+0.065814076 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:31:38 compute-1 podman[209808]: 2026-02-16 13:31:38.94683913 +0000 UTC m=+0.073188926 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, version=9.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:31:39 compute-1 sshd-session[209806]: Invalid user admin from 146.190.226.24 port 60388
Feb 16 13:31:39 compute-1 sshd-session[209806]: Connection closed by invalid user admin 146.190.226.24 port 60388 [preauth]
Feb 16 13:31:39 compute-1 nova_compute[185910]: 2026-02-16 13:31:39.559 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:43 compute-1 nova_compute[185910]: 2026-02-16 13:31:43.399 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:44 compute-1 nova_compute[185910]: 2026-02-16 13:31:44.562 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:44 compute-1 podman[209850]: 2026-02-16 13:31:44.938796512 +0000 UTC m=+0.077246747 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:31:48 compute-1 nova_compute[185910]: 2026-02-16 13:31:48.402 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:49 compute-1 openstack_network_exporter[198096]: ERROR   13:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:31:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:31:49 compute-1 openstack_network_exporter[198096]: ERROR   13:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:31:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:31:49 compute-1 nova_compute[185910]: 2026-02-16 13:31:49.564 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:51 compute-1 ovn_controller[96285]: 2026-02-16T13:31:51Z|00085|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 13:31:52 compute-1 podman[209876]: 2026-02-16 13:31:52.913781253 +0000 UTC m=+0.054445778 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:31:53 compute-1 nova_compute[185910]: 2026-02-16 13:31:53.403 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:54 compute-1 nova_compute[185910]: 2026-02-16 13:31:54.567 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:58 compute-1 nova_compute[185910]: 2026-02-16 13:31:58.404 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:31:59 compute-1 nova_compute[185910]: 2026-02-16 13:31:59.570 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:01 compute-1 anacron[60676]: Job `cron.daily' started
Feb 16 13:32:01 compute-1 anacron[60676]: Job `cron.daily' terminated
Feb 16 13:32:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:03.338 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:03.339 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:03.339 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:03 compute-1 nova_compute[185910]: 2026-02-16 13:32:03.406 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:04 compute-1 nova_compute[185910]: 2026-02-16 13:32:04.572 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:05 compute-1 podman[195236]: time="2026-02-16T13:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:32:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:32:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:32:07 compute-1 nova_compute[185910]: 2026-02-16 13:32:07.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:08 compute-1 nova_compute[185910]: 2026-02-16 13:32:08.408 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:08 compute-1 nova_compute[185910]: 2026-02-16 13:32:08.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:08 compute-1 nova_compute[185910]: 2026-02-16 13:32:08.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:09 compute-1 nova_compute[185910]: 2026-02-16 13:32:09.575 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:09 compute-1 podman[209907]: 2026-02-16 13:32:09.922227762 +0000 UTC m=+0.058236051 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:32:09 compute-1 podman[209906]: 2026-02-16 13:32:09.948559321 +0000 UTC m=+0.090270496 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1770267347, architecture=x86_64, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 16 13:32:10 compute-1 nova_compute[185910]: 2026-02-16 13:32:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:12 compute-1 nova_compute[185910]: 2026-02-16 13:32:12.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:13 compute-1 nova_compute[185910]: 2026-02-16 13:32:13.410 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:14 compute-1 nova_compute[185910]: 2026-02-16 13:32:14.578 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:14 compute-1 nova_compute[185910]: 2026-02-16 13:32:14.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:15 compute-1 nova_compute[185910]: 2026-02-16 13:32:15.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:15 compute-1 nova_compute[185910]: 2026-02-16 13:32:15.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:32:15 compute-1 podman[209947]: 2026-02-16 13:32:15.945752326 +0000 UTC m=+0.082248287 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.715 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.716 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.716 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.716 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.799 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.880 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.882 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:16 compute-1 nova_compute[185910]: 2026-02-16 13:32:16.933 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.082 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.084 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5697MB free_disk=73.19478607177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.085 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.085 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.386 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 07689e3f-f214-4f57-a662-bc531b614c3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.387 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.387 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.428 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.451 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.453 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:32:17 compute-1 nova_compute[185910]: 2026-02-16 13:32:17.453 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:18 compute-1 nova_compute[185910]: 2026-02-16 13:32:18.412 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:18 compute-1 nova_compute[185910]: 2026-02-16 13:32:18.453 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:32:18 compute-1 nova_compute[185910]: 2026-02-16 13:32:18.454 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:32:18 compute-1 nova_compute[185910]: 2026-02-16 13:32:18.454 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:32:19 compute-1 openstack_network_exporter[198096]: ERROR   13:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:32:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:32:19 compute-1 openstack_network_exporter[198096]: ERROR   13:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:32:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:32:19 compute-1 nova_compute[185910]: 2026-02-16 13:32:19.580 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:20 compute-1 nova_compute[185910]: 2026-02-16 13:32:20.861 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:32:20 compute-1 nova_compute[185910]: 2026-02-16 13:32:20.862 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:32:20 compute-1 nova_compute[185910]: 2026-02-16 13:32:20.862 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:32:20 compute-1 nova_compute[185910]: 2026-02-16 13:32:20.862 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:32:23 compute-1 nova_compute[185910]: 2026-02-16 13:32:23.018 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:32:23 compute-1 nova_compute[185910]: 2026-02-16 13:32:23.037 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:32:23 compute-1 nova_compute[185910]: 2026-02-16 13:32:23.038 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:32:23 compute-1 nova_compute[185910]: 2026-02-16 13:32:23.414 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:23 compute-1 podman[209981]: 2026-02-16 13:32:23.925747859 +0000 UTC m=+0.052974917 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:32:24 compute-1 sshd-session[210007]: Invalid user nagios from 188.166.42.159 port 38610
Feb 16 13:32:24 compute-1 nova_compute[185910]: 2026-02-16 13:32:24.584 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:24 compute-1 sshd-session[210007]: Connection closed by invalid user nagios 188.166.42.159 port 38610 [preauth]
Feb 16 13:32:26 compute-1 nova_compute[185910]: 2026-02-16 13:32:26.654 185914 DEBUG nova.compute.manager [None req-831ce854-d8c6-4970-b891-0cd0f932fdf8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:32:26 compute-1 nova_compute[185910]: 2026-02-16 13:32:26.700 185914 DEBUG nova.compute.provider_tree [None req-831ce854-d8c6-4970-b891-0cd0f932fdf8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 10 to 11 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:32:28 compute-1 nova_compute[185910]: 2026-02-16 13:32:28.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:29 compute-1 nova_compute[185910]: 2026-02-16 13:32:29.587 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:31 compute-1 nova_compute[185910]: 2026-02-16 13:32:31.905 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Check if temp file /var/lib/nova/instances/tmp4z78b9dd exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:32:31 compute-1 nova_compute[185910]: 2026-02-16 13:32:31.906 185914 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:32:33 compute-1 nova_compute[185910]: 2026-02-16 13:32:33.440 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:34 compute-1 nova_compute[185910]: 2026-02-16 13:32:34.125 185914 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:34 compute-1 nova_compute[185910]: 2026-02-16 13:32:34.191 185914 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:34 compute-1 nova_compute[185910]: 2026-02-16 13:32:34.192 185914 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:32:34 compute-1 nova_compute[185910]: 2026-02-16 13:32:34.237 185914 DEBUG oslo_concurrency.processutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:32:34 compute-1 nova_compute[185910]: 2026-02-16 13:32:34.590 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:35 compute-1 podman[195236]: time="2026-02-16T13:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:32:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:32:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2635 "" "Go-http-client/1.1"
Feb 16 13:32:36 compute-1 sshd-session[210029]: Accepted publickey for nova from 192.168.122.100 port 38468 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:32:36 compute-1 systemd-logind[821]: New session 34 of user nova.
Feb 16 13:32:36 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:32:36 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:32:36 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:32:36 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:32:36 compute-1 systemd[210033]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:32:36 compute-1 systemd[210033]: Queued start job for default target Main User Target.
Feb 16 13:32:36 compute-1 systemd[210033]: Created slice User Application Slice.
Feb 16 13:32:36 compute-1 systemd[210033]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:32:36 compute-1 systemd[210033]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:32:36 compute-1 systemd[210033]: Reached target Paths.
Feb 16 13:32:36 compute-1 systemd[210033]: Reached target Timers.
Feb 16 13:32:36 compute-1 systemd[210033]: Starting D-Bus User Message Bus Socket...
Feb 16 13:32:36 compute-1 systemd[210033]: Starting Create User's Volatile Files and Directories...
Feb 16 13:32:36 compute-1 systemd[210033]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:32:36 compute-1 systemd[210033]: Reached target Sockets.
Feb 16 13:32:36 compute-1 systemd[210033]: Finished Create User's Volatile Files and Directories.
Feb 16 13:32:36 compute-1 systemd[210033]: Reached target Basic System.
Feb 16 13:32:36 compute-1 systemd[210033]: Reached target Main User Target.
Feb 16 13:32:36 compute-1 systemd[210033]: Startup finished in 109ms.
Feb 16 13:32:36 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:32:36 compute-1 systemd[1]: Started Session 34 of User nova.
Feb 16 13:32:36 compute-1 sshd-session[210029]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:32:37 compute-1 sshd-session[210048]: Received disconnect from 192.168.122.100 port 38468:11: disconnected by user
Feb 16 13:32:37 compute-1 sshd-session[210048]: Disconnected from user nova 192.168.122.100 port 38468
Feb 16 13:32:37 compute-1 sshd-session[210029]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:32:37 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Feb 16 13:32:37 compute-1 systemd-logind[821]: Session 34 logged out. Waiting for processes to exit.
Feb 16 13:32:37 compute-1 systemd-logind[821]: Removed session 34.
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.625 185914 DEBUG nova.compute.manager [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.627 185914 DEBUG oslo_concurrency.lockutils [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.627 185914 DEBUG oslo_concurrency.lockutils [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.628 185914 DEBUG oslo_concurrency.lockutils [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.628 185914 DEBUG nova.compute.manager [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.628 185914 DEBUG nova.compute.manager [req-24c79b31-02dd-4682-87e2-d71a3965700a req-87302cdb-a8b1-400d-ad17-e5f3ce72272e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:32:37 compute-1 nova_compute[185910]: 2026-02-16 13:32:37.631 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:37 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:37.631 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:32:37 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:37.633 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.480 185914 INFO nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Took 4.24 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.481 185914 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.511 185914 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4z78b9dd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='07689e3f-f214-4f57-a662-bc531b614c3d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(692d197f-e44d-403a-acdb-38bd0bdc6861),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.539 185914 DEBUG nova.objects.instance [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 07689e3f-f214-4f57-a662-bc531b614c3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.541 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.543 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.543 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.564 185914 DEBUG nova.virt.libvirt.vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:31:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:31:11Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.564 185914 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.565 185914 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.566 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:32:38 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:ba:1f:94"/>
Feb 16 13:32:38 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:32:38 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:32:38 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:32:38 compute-1 nova_compute[185910]:   <target dev="tap9ac0912f-d5"/>
Feb 16 13:32:38 compute-1 nova_compute[185910]: </interface>
Feb 16 13:32:38 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:32:38 compute-1 nova_compute[185910]: 2026-02-16 13:32:38.566 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.046 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.047 185914 INFO nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.165 185914 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.593 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.669 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.669 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.769 185914 DEBUG nova.compute.manager [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.770 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.770 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.771 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.771 185914 DEBUG nova.compute.manager [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.772 185914 WARNING nova.compute.manager [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state migrating.
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.772 185914 DEBUG nova.compute.manager [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-changed-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.773 185914 DEBUG nova.compute.manager [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Refreshing instance network info cache due to event network-changed-9ac0912f-d593-4dad-bf05-01d7dd0b6677. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.773 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.774 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:32:39 compute-1 nova_compute[185910]: 2026-02-16 13:32:39.775 185914 DEBUG nova.network.neutron [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Refreshing network info cache for port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:32:40 compute-1 nova_compute[185910]: 2026-02-16 13:32:40.174 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:40 compute-1 nova_compute[185910]: 2026-02-16 13:32:40.175 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:32:40 compute-1 nova_compute[185910]: 2026-02-16 13:32:40.679 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:40 compute-1 nova_compute[185910]: 2026-02-16 13:32:40.680 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:32:40 compute-1 podman[210059]: 2026-02-16 13:32:40.918956811 +0000 UTC m=+0.055623240 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, distribution-scope=public, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:32:40 compute-1 podman[210060]: 2026-02-16 13:32:40.947374136 +0000 UTC m=+0.081567437 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.184 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.185 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.688 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.688 185914 DEBUG nova.virt.libvirt.migration [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.706 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248761.7060628, 07689e3f-f214-4f57-a662-bc531b614c3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.707 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Paused (Lifecycle Event)
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.743 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.747 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.781 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:32:41 compute-1 kernel: tap9ac0912f-d5 (unregistering): left promiscuous mode
Feb 16 13:32:41 compute-1 NetworkManager[56388]: <info>  [1771248761.8382] device (tap9ac0912f-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.839 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.842 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:41 compute-1 ovn_controller[96285]: 2026-02-16T13:32:41Z|00086|binding|INFO|Releasing lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 from this chassis (sb_readonly=0)
Feb 16 13:32:41 compute-1 ovn_controller[96285]: 2026-02-16T13:32:41Z|00087|binding|INFO|Setting lport 9ac0912f-d593-4dad-bf05-01d7dd0b6677 down in Southbound
Feb 16 13:32:41 compute-1 ovn_controller[96285]: 2026-02-16T13:32:41Z|00088|binding|INFO|Removing iface tap9ac0912f-d5 ovn-installed in OVS
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.844 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:41 compute-1 nova_compute[185910]: 2026-02-16 13:32:41.847 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:41.850 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:1f:94 10.100.0.14'], port_security=['fa:16:3e:ba:1f:94 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '07689e3f-f214-4f57-a662-bc531b614c3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34e10b77-8ec0-4af1-a031-d83792585eee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d212b8e966a499a9aad9b972bb7e76d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '145107b4-bbb8-4e69-b3bf-db62f38a1f3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1fdee5c0-c83c-45cf-986e-fa2b109e36c1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=9ac0912f-d593-4dad-bf05-01d7dd0b6677) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:32:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:41.852 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 in datapath 34e10b77-8ec0-4af1-a031-d83792585eee unbound from our chassis
Feb 16 13:32:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:41.853 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34e10b77-8ec0-4af1-a031-d83792585eee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:32:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:41.855 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9a0178-34ad-44af-99cc-090d01a2f3ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:41.856 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee namespace which is not needed anymore
Feb 16 13:32:41 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 16 13:32:41 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000009.scope: Consumed 17.222s CPU time.
Feb 16 13:32:41 compute-1 systemd-machined[155419]: Machine qemu-7-instance-00000009 terminated.
Feb 16 13:32:41 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [NOTICE]   (209710) : haproxy version is 2.8.14-c23fe91
Feb 16 13:32:41 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [NOTICE]   (209710) : path to executable is /usr/sbin/haproxy
Feb 16 13:32:41 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [ALERT]    (209710) : Current worker (209713) exited with code 143 (Terminated)
Feb 16 13:32:41 compute-1 neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee[209706]: [WARNING]  (209710) : All workers exited. Exiting... (0)
Feb 16 13:32:41 compute-1 systemd[1]: libpod-a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0.scope: Deactivated successfully.
Feb 16 13:32:41 compute-1 podman[210123]: 2026-02-16 13:32:41.977516158 +0000 UTC m=+0.044491926 container died a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:32:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0-userdata-shm.mount: Deactivated successfully.
Feb 16 13:32:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-cd6a1d25bc523b1dc5f2bf65c1750479e5ad75ba6bd1e46312f9fdc914afb4bf-merged.mount: Deactivated successfully.
Feb 16 13:32:42 compute-1 podman[210123]: 2026-02-16 13:32:42.008070922 +0000 UTC m=+0.075046690 container cleanup a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:32:42 compute-1 systemd[1]: libpod-conmon-a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0.scope: Deactivated successfully.
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.065 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.067 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.067 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:32:42 compute-1 podman[210154]: 2026-02-16 13:32:42.069183531 +0000 UTC m=+0.045118443 container remove a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.072 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1ebd159e-14c2-4c9d-a324-a67e9a028789]: (4, ('Mon Feb 16 01:32:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee (a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0)\na608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0\nMon Feb 16 01:32:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee (a608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0)\na608d1dbab176c0b55722e5a8a1f980377d3357100052f57fb042581dd73ebc0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.074 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[109b7302-8914-4ca3-b662-a8bf2cebe6c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.075 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap34e10b77-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.077 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:42 compute-1 kernel: tap34e10b77-80: left promiscuous mode
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.084 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.087 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[72b933fe-b877-4705-830f-a7ce72eb49ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.107 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[22e4994c-e115-4d60-be47-69e1d16de484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.109 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b349f3-cab9-482b-a536-b37284c05257]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.121 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd79bb9-9cd4-408f-a783-520cebaa5e29]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475469, 'reachable_time': 17657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210190, 'error': None, 'target': 'ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d34e10b77\x2d8ec0\x2d4af1\x2da031\x2dd83792585eee.mount: Deactivated successfully.
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.124 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-34e10b77-8ec0-4af1-a031-d83792585eee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:32:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:42.125 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[15d98f4f-ddc9-4721-a00a-1bcfc7cfa22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.137 185914 DEBUG nova.compute.manager [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.138 185914 DEBUG oslo_concurrency.lockutils [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.138 185914 DEBUG oslo_concurrency.lockutils [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.139 185914 DEBUG oslo_concurrency.lockutils [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.139 185914 DEBUG nova.compute.manager [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.140 185914 DEBUG nova.compute.manager [req-a3882ad9-d05f-4a6a-824e-10dac20c5fe1 req-68a711f1-e4fd-4a2f-9dee-405c717047a5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.191 185914 DEBUG nova.virt.libvirt.guest [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '07689e3f-f214-4f57-a662-bc531b614c3d' (instance-00000009) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.192 185914 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migration operation has completed
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.192 185914 INFO nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] _post_live_migration() is started..
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.254 185914 DEBUG nova.network.neutron [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updated VIF entry in instance network info cache for port 9ac0912f-d593-4dad-bf05-01d7dd0b6677. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.255 185914 DEBUG nova.network.neutron [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Updating instance_info_cache with network_info: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:32:42 compute-1 nova_compute[185910]: 2026-02-16 13:32:42.610 185914 DEBUG oslo_concurrency.lockutils [req-a5d80748-72d3-42a1-98b8-c384722d96a2 req-fd76456d-0de3-4f5c-b52c-c5861b908061 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-07689e3f-f214-4f57-a662-bc531b614c3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.444 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.641 185914 DEBUG nova.compute.manager [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.641 185914 DEBUG oslo_concurrency.lockutils [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.641 185914 DEBUG oslo_concurrency.lockutils [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.642 185914 DEBUG oslo_concurrency.lockutils [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.642 185914 DEBUG nova.compute.manager [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.642 185914 DEBUG nova.compute.manager [req-9f7dc5bb-ec48-4141-929b-f5ace2a97c25 req-66a70923-e3f9-42ab-a470-eedcc66a5fb5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-unplugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.924 185914 DEBUG nova.network.neutron [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 9ac0912f-d593-4dad-bf05-01d7dd0b6677 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.925 185914 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.926 185914 DEBUG nova.virt.libvirt.vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:30:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1190069071',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1190069071',id=9,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:31:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9d212b8e966a499a9aad9b972bb7e76d',ramdisk_id='',reservation_id='r-wzp5jrw0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-464275700',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-464275700-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:32:29Z,user_data=None,user_id='8712c0037def471dabf14879c0a418ec',uuid=07689e3f-f214-4f57-a662-bc531b614c3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.926 185914 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "address": "fa:16:3e:ba:1f:94", "network": {"id": "34e10b77-8ec0-4af1-a031-d83792585eee", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-944275405-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d212b8e966a499a9aad9b972bb7e76d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ac0912f-d5", "ovs_interfaceid": "9ac0912f-d593-4dad-bf05-01d7dd0b6677", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.927 185914 DEBUG nova.network.os_vif_util [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.928 185914 DEBUG os_vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.930 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.931 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ac0912f-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.933 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.936 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.941 185914 INFO os_vif [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:1f:94,bridge_name='br-int',has_traffic_filtering=True,id=9ac0912f-d593-4dad-bf05-01d7dd0b6677,network=Network(34e10b77-8ec0-4af1-a031-d83792585eee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ac0912f-d5')
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.941 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.942 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.942 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.942 185914 DEBUG nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.943 185914 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Deleting instance files /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d_del
Feb 16 13:32:43 compute-1 nova_compute[185910]: 2026-02-16 13:32:43.943 185914 INFO nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Deletion of /var/lib/nova/instances/07689e3f-f214-4f57-a662-bc531b614c3d_del complete
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.297 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.298 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.298 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.298 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.298 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.299 185914 WARNING nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state migrating.
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.299 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.299 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.300 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.300 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.300 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.300 185914 WARNING nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state migrating.
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.301 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.301 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.301 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.301 185914 DEBUG oslo_concurrency.lockutils [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.301 185914 DEBUG nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:44 compute-1 nova_compute[185910]: 2026-02-16 13:32:44.302 185914 WARNING nova.compute.manager [req-fb8c05c1-5930-4a49-ab83-d459b54729c3 req-ea202d96-0bce-4689-94b5-547266ae30ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state migrating.
Feb 16 13:32:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:32:45.635 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:32:46 compute-1 sshd-session[210191]: Invalid user admin from 146.190.226.24 port 45884
Feb 16 13:32:46 compute-1 sshd-session[210191]: Connection closed by invalid user admin 146.190.226.24 port 45884 [preauth]
Feb 16 13:32:46 compute-1 podman[210193]: 2026-02-16 13:32:46.509805837 +0000 UTC m=+0.100679170 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.697 185914 DEBUG nova.compute.manager [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.697 185914 DEBUG oslo_concurrency.lockutils [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.697 185914 DEBUG oslo_concurrency.lockutils [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.698 185914 DEBUG oslo_concurrency.lockutils [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.698 185914 DEBUG nova.compute.manager [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] No waiting events found dispatching network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:32:46 compute-1 nova_compute[185910]: 2026-02-16 13:32:46.698 185914 WARNING nova.compute.manager [req-19e0bfc0-6464-4e8b-8a1f-76650815e7b4 req-13ae63d2-bd68-47df-901d-c7ab5431f8a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Received unexpected event network-vif-plugged-9ac0912f-d593-4dad-bf05-01d7dd0b6677 for instance with vm_state active and task_state migrating.
Feb 16 13:32:47 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:32:47 compute-1 systemd[210033]: Activating special unit Exit the Session...
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped target Main User Target.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped target Basic System.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped target Paths.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped target Sockets.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped target Timers.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:32:47 compute-1 systemd[210033]: Closed D-Bus User Message Bus Socket.
Feb 16 13:32:47 compute-1 systemd[210033]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:32:47 compute-1 systemd[210033]: Removed slice User Application Slice.
Feb 16 13:32:47 compute-1 systemd[210033]: Reached target Shutdown.
Feb 16 13:32:47 compute-1 systemd[210033]: Finished Exit the Session.
Feb 16 13:32:47 compute-1 systemd[210033]: Reached target Exit the Session.
Feb 16 13:32:47 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:32:47 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:32:47 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:32:47 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:32:47 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:32:47 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:32:47 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:32:48 compute-1 nova_compute[185910]: 2026-02-16 13:32:48.448 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:48 compute-1 nova_compute[185910]: 2026-02-16 13:32:48.934 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:49 compute-1 openstack_network_exporter[198096]: ERROR   13:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:32:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:32:49 compute-1 openstack_network_exporter[198096]: ERROR   13:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:32:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:32:53 compute-1 nova_compute[185910]: 2026-02-16 13:32:53.449 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:53 compute-1 nova_compute[185910]: 2026-02-16 13:32:53.937 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:54 compute-1 podman[210220]: 2026-02-16 13:32:54.916911298 +0000 UTC m=+0.059327831 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.890 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.890 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.890 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "07689e3f-f214-4f57-a662-bc531b614c3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.948 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.949 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.949 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:55 compute-1 nova_compute[185910]: 2026-02-16 13:32:55.949 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.074 185914 WARNING nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.075 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5808MB free_disk=73.22369003295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.075 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.075 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.150 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 07689e3f-f214-4f57-a662-bc531b614c3d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.179 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.229 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 692d197f-e44d-403a-acdb-38bd0bdc6861 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.230 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.230 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.416 185914 DEBUG nova.compute.provider_tree [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.472 185914 DEBUG nova.scheduler.client.report [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.510 185914 DEBUG nova.compute.resource_tracker [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.511 185914 DEBUG oslo_concurrency.lockutils [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.516 185914 INFO nova.compute.manager [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.749 185914 INFO nova.scheduler.client.report [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 692d197f-e44d-403a-acdb-38bd0bdc6861
Feb 16 13:32:56 compute-1 nova_compute[185910]: 2026-02-16 13:32:56.750 185914 DEBUG nova.virt.libvirt.driver [None req-d7cd74d4-7171-42db-a947-41618a902942 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:32:57 compute-1 nova_compute[185910]: 2026-02-16 13:32:57.064 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771248762.062691, 07689e3f-f214-4f57-a662-bc531b614c3d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:32:57 compute-1 nova_compute[185910]: 2026-02-16 13:32:57.064 185914 INFO nova.compute.manager [-] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] VM Stopped (Lifecycle Event)
Feb 16 13:32:57 compute-1 nova_compute[185910]: 2026-02-16 13:32:57.125 185914 DEBUG nova.compute.manager [None req-93a31dd6-81aa-4263-9c21-8e2ec2c15025 - - - - - -] [instance: 07689e3f-f214-4f57-a662-bc531b614c3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:32:58 compute-1 nova_compute[185910]: 2026-02-16 13:32:58.451 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:32:58 compute-1 nova_compute[185910]: 2026-02-16 13:32:58.939 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:33:03.340 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:33:03.341 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:33:03.341 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:03 compute-1 nova_compute[185910]: 2026-02-16 13:33:03.453 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:03 compute-1 nova_compute[185910]: 2026-02-16 13:33:03.942 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:05 compute-1 podman[195236]: time="2026-02-16T13:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:33:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:33:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:33:08 compute-1 nova_compute[185910]: 2026-02-16 13:33:08.455 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:08 compute-1 nova_compute[185910]: 2026-02-16 13:33:08.944 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:09 compute-1 nova_compute[185910]: 2026-02-16 13:33:09.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:10 compute-1 nova_compute[185910]: 2026-02-16 13:33:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:10 compute-1 nova_compute[185910]: 2026-02-16 13:33:10.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:10 compute-1 nova_compute[185910]: 2026-02-16 13:33:10.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:11 compute-1 podman[210245]: 2026-02-16 13:33:11.90945361 +0000 UTC m=+0.053185873 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., release=1770267347, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 16 13:33:11 compute-1 podman[210246]: 2026-02-16 13:33:11.93767911 +0000 UTC m=+0.076655123 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:33:13 compute-1 nova_compute[185910]: 2026-02-16 13:33:13.458 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:13 compute-1 nova_compute[185910]: 2026-02-16 13:33:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:13 compute-1 nova_compute[185910]: 2026-02-16 13:33:13.946 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:14 compute-1 nova_compute[185910]: 2026-02-16 13:33:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:14 compute-1 nova_compute[185910]: 2026-02-16 13:33:14.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:33:14 compute-1 nova_compute[185910]: 2026-02-16 13:33:14.661 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:33:15 compute-1 nova_compute[185910]: 2026-02-16 13:33:15.656 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:16 compute-1 nova_compute[185910]: 2026-02-16 13:33:16.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:16 compute-1 podman[210286]: 2026-02-16 13:33:16.930185358 +0000 UTC m=+0.075266786 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:33:17 compute-1 sshd-session[210314]: Invalid user guest from 188.166.42.159 port 58652
Feb 16 13:33:18 compute-1 sshd-session[210314]: Connection closed by invalid user guest 188.166.42.159 port 58652 [preauth]
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.402 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.402 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.402 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.402 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.460 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.554 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.555 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5828MB free_disk=73.22368240356445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.556 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.556 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:33:18 compute-1 nova_compute[185910]: 2026-02-16 13:33:18.950 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.109 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.110 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.173 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.230 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.232 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.232 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:33:19 compute-1 nova_compute[185910]: 2026-02-16 13:33:19.233 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:19 compute-1 openstack_network_exporter[198096]: ERROR   13:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:33:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:33:19 compute-1 openstack_network_exporter[198096]: ERROR   13:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:33:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.274 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.275 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.275 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.298 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.299 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.299 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:20 compute-1 nova_compute[185910]: 2026-02-16 13:33:20.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:33:23 compute-1 nova_compute[185910]: 2026-02-16 13:33:23.461 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:23 compute-1 nova_compute[185910]: 2026-02-16 13:33:23.953 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:25 compute-1 nova_compute[185910]: 2026-02-16 13:33:25.655 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:25 compute-1 podman[210317]: 2026-02-16 13:33:25.939402318 +0000 UTC m=+0.074281639 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:33:28 compute-1 nova_compute[185910]: 2026-02-16 13:33:28.465 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:28 compute-1 nova_compute[185910]: 2026-02-16 13:33:28.955 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:33 compute-1 nova_compute[185910]: 2026-02-16 13:33:33.475 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:33 compute-1 nova_compute[185910]: 2026-02-16 13:33:33.957 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:35 compute-1 podman[195236]: time="2026-02-16T13:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:33:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:33:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:33:38 compute-1 nova_compute[185910]: 2026-02-16 13:33:38.477 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:38 compute-1 nova_compute[185910]: 2026-02-16 13:33:38.960 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:42 compute-1 podman[210342]: 2026-02-16 13:33:42.912666396 +0000 UTC m=+0.052196436 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 13:33:42 compute-1 podman[210343]: 2026-02-16 13:33:42.934887823 +0000 UTC m=+0.063659989 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 16 13:33:43 compute-1 ovn_controller[96285]: 2026-02-16T13:33:43Z|00089|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:33:43 compute-1 nova_compute[185910]: 2026-02-16 13:33:43.511 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:43 compute-1 nova_compute[185910]: 2026-02-16 13:33:43.962 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:47 compute-1 podman[210382]: 2026-02-16 13:33:47.930821933 +0000 UTC m=+0.075658916 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 13:33:48 compute-1 nova_compute[185910]: 2026-02-16 13:33:48.514 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:48 compute-1 nova_compute[185910]: 2026-02-16 13:33:48.965 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:49 compute-1 openstack_network_exporter[198096]: ERROR   13:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:33:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:33:49 compute-1 openstack_network_exporter[198096]: ERROR   13:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:33:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:33:49 compute-1 sshd-session[210409]: Invalid user ubnt from 2.57.122.210 port 55304
Feb 16 13:33:49 compute-1 sshd-session[210409]: Connection closed by invalid user ubnt 2.57.122.210 port 55304 [preauth]
Feb 16 13:33:51 compute-1 nova_compute[185910]: 2026-02-16 13:33:51.896 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:33:52 compute-1 sshd-session[210411]: Invalid user admin from 146.190.226.24 port 59166
Feb 16 13:33:52 compute-1 sshd-session[210411]: Connection closed by invalid user admin 146.190.226.24 port 59166 [preauth]
Feb 16 13:33:53 compute-1 nova_compute[185910]: 2026-02-16 13:33:53.517 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:53 compute-1 nova_compute[185910]: 2026-02-16 13:33:53.967 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:56 compute-1 podman[210413]: 2026-02-16 13:33:56.898833038 +0000 UTC m=+0.039531310 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:33:58 compute-1 nova_compute[185910]: 2026-02-16 13:33:58.520 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:33:58 compute-1 nova_compute[185910]: 2026-02-16 13:33:58.970 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:03.341 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:03.342 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:03.342 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:03 compute-1 nova_compute[185910]: 2026-02-16 13:34:03.522 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:03 compute-1 nova_compute[185910]: 2026-02-16 13:34:03.972 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:05 compute-1 podman[195236]: time="2026-02-16T13:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:34:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:34:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 13:34:08 compute-1 nova_compute[185910]: 2026-02-16 13:34:08.524 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:08 compute-1 nova_compute[185910]: 2026-02-16 13:34:08.974 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:09 compute-1 sshd-session[210438]: Invalid user weblogic from 188.166.42.159 port 49906
Feb 16 13:34:09 compute-1 sshd-session[210438]: Connection closed by invalid user weblogic 188.166.42.159 port 49906 [preauth]
Feb 16 13:34:10 compute-1 nova_compute[185910]: 2026-02-16 13:34:10.692 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:10 compute-1 nova_compute[185910]: 2026-02-16 13:34:10.692 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:11 compute-1 nova_compute[185910]: 2026-02-16 13:34:11.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:12 compute-1 nova_compute[185910]: 2026-02-16 13:34:12.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:13 compute-1 nova_compute[185910]: 2026-02-16 13:34:13.526 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:13 compute-1 podman[210441]: 2026-02-16 13:34:13.914091962 +0000 UTC m=+0.048105414 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:34:13 compute-1 podman[210440]: 2026-02-16 13:34:13.941979773 +0000 UTC m=+0.080212991 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.expose-services=)
Feb 16 13:34:13 compute-1 nova_compute[185910]: 2026-02-16 13:34:13.976 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:14 compute-1 nova_compute[185910]: 2026-02-16 13:34:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:15 compute-1 nova_compute[185910]: 2026-02-16 13:34:15.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.660 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.661 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.661 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.661 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.801 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.802 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5856MB free_disk=73.22367858886719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.802 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.802 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.983 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:34:16 compute-1 nova_compute[185910]: 2026-02-16 13:34:16.983 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:34:17 compute-1 nova_compute[185910]: 2026-02-16 13:34:17.136 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:34:17 compute-1 nova_compute[185910]: 2026-02-16 13:34:17.157 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:34:17 compute-1 nova_compute[185910]: 2026-02-16 13:34:17.158 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:34:17 compute-1 nova_compute[185910]: 2026-02-16 13:34:17.158 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:34:18 compute-1 nova_compute[185910]: 2026-02-16 13:34:18.528 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:18 compute-1 podman[210479]: 2026-02-16 13:34:18.963438432 +0000 UTC m=+0.101382699 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:34:18 compute-1 nova_compute[185910]: 2026-02-16 13:34:18.978 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:19 compute-1 openstack_network_exporter[198096]: ERROR   13:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:34:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:34:19 compute-1 openstack_network_exporter[198096]: ERROR   13:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:34:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:34:19 compute-1 nova_compute[185910]: 2026-02-16 13:34:19.830 185914 DEBUG nova.compute.manager [None req-b5c22755-fd33-4596-a69e-02f198f63721 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:34:19 compute-1 nova_compute[185910]: 2026-02-16 13:34:19.926 185914 DEBUG nova.compute.provider_tree [None req-b5c22755-fd33-4596-a69e-02f198f63721 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 11 to 14 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.158 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.158 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.159 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.212 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.213 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:34:20 compute-1 nova_compute[185910]: 2026-02-16 13:34:20.213 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:34:23 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:23.128 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:34:23 compute-1 nova_compute[185910]: 2026-02-16 13:34:23.128 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:23 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:23.129 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:34:23 compute-1 nova_compute[185910]: 2026-02-16 13:34:23.529 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:23 compute-1 nova_compute[185910]: 2026-02-16 13:34:23.979 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:27 compute-1 nova_compute[185910]: 2026-02-16 13:34:27.262 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:27 compute-1 podman[210506]: 2026-02-16 13:34:27.905689273 +0000 UTC m=+0.042516801 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:34:28 compute-1 nova_compute[185910]: 2026-02-16 13:34:28.537 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:28 compute-1 nova_compute[185910]: 2026-02-16 13:34:28.982 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:34:31.134 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:34:33 compute-1 nova_compute[185910]: 2026-02-16 13:34:33.541 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:33 compute-1 nova_compute[185910]: 2026-02-16 13:34:33.985 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:35 compute-1 podman[195236]: time="2026-02-16T13:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:34:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:34:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:34:38 compute-1 nova_compute[185910]: 2026-02-16 13:34:38.543 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:38 compute-1 nova_compute[185910]: 2026-02-16 13:34:38.988 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:43 compute-1 nova_compute[185910]: 2026-02-16 13:34:43.544 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:43 compute-1 nova_compute[185910]: 2026-02-16 13:34:43.990 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:44 compute-1 podman[210532]: 2026-02-16 13:34:44.908901159 +0000 UTC m=+0.049220345 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:34:44 compute-1 podman[210531]: 2026-02-16 13:34:44.940901552 +0000 UTC m=+0.080078327 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:34:48 compute-1 nova_compute[185910]: 2026-02-16 13:34:48.546 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:48 compute-1 nova_compute[185910]: 2026-02-16 13:34:48.992 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:49 compute-1 openstack_network_exporter[198096]: ERROR   13:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:34:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:34:49 compute-1 openstack_network_exporter[198096]: ERROR   13:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:34:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:34:49 compute-1 podman[210571]: 2026-02-16 13:34:49.91983938 +0000 UTC m=+0.063826214 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:34:53 compute-1 nova_compute[185910]: 2026-02-16 13:34:53.548 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:53 compute-1 nova_compute[185910]: 2026-02-16 13:34:53.994 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:58 compute-1 nova_compute[185910]: 2026-02-16 13:34:58.550 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:58 compute-1 podman[210599]: 2026-02-16 13:34:58.905085665 +0000 UTC m=+0.044004642 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:34:58 compute-1 nova_compute[185910]: 2026-02-16 13:34:58.997 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:34:59 compute-1 sshd-session[210598]: Invalid user admin from 146.190.226.24 port 34080
Feb 16 13:34:59 compute-1 sshd-session[210598]: Connection closed by invalid user admin 146.190.226.24 port 34080 [preauth]
Feb 16 13:35:02 compute-1 sshd-session[210624]: Invalid user mysql from 188.166.42.159 port 37048
Feb 16 13:35:02 compute-1 sshd-session[210624]: Connection closed by invalid user mysql 188.166.42.159 port 37048 [preauth]
Feb 16 13:35:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:03.342 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:03.343 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:03.343 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:03 compute-1 nova_compute[185910]: 2026-02-16 13:35:03.551 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:04 compute-1 nova_compute[185910]: 2026-02-16 13:35:04.041 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:05 compute-1 ovn_controller[96285]: 2026-02-16T13:35:05Z|00090|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:35:05 compute-1 podman[195236]: time="2026-02-16T13:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:35:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:35:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:35:08 compute-1 nova_compute[185910]: 2026-02-16 13:35:08.572 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:09 compute-1 nova_compute[185910]: 2026-02-16 13:35:09.043 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:11 compute-1 nova_compute[185910]: 2026-02-16 13:35:11.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:12 compute-1 nova_compute[185910]: 2026-02-16 13:35:12.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:13 compute-1 nova_compute[185910]: 2026-02-16 13:35:13.574 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:13 compute-1 nova_compute[185910]: 2026-02-16 13:35:13.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:13 compute-1 nova_compute[185910]: 2026-02-16 13:35:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:14 compute-1 nova_compute[185910]: 2026-02-16 13:35:14.045 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:14 compute-1 nova_compute[185910]: 2026-02-16 13:35:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:15 compute-1 podman[210626]: 2026-02-16 13:35:15.906110102 +0000 UTC m=+0.047244231 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter)
Feb 16 13:35:15 compute-1 podman[210627]: 2026-02-16 13:35:15.921972775 +0000 UTC m=+0.056659168 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.855 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.856 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.856 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:16 compute-1 nova_compute[185910]: 2026-02-16 13:35:16.856 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.015 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.017 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5860MB free_disk=73.22368240356445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.017 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.017 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.097 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.097 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.127 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.153 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.154 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:35:17 compute-1 nova_compute[185910]: 2026-02-16 13:35:17.155 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:18 compute-1 nova_compute[185910]: 2026-02-16 13:35:18.622 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:19 compute-1 nova_compute[185910]: 2026-02-16 13:35:19.047 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:19 compute-1 openstack_network_exporter[198096]: ERROR   13:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:35:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:35:19 compute-1 openstack_network_exporter[198096]: ERROR   13:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:35:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:35:20 compute-1 podman[210666]: 2026-02-16 13:35:20.936405819 +0000 UTC m=+0.077315772 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.155 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.155 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.156 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.177 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:21 compute-1 nova_compute[185910]: 2026-02-16 13:35:21.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:35:23 compute-1 nova_compute[185910]: 2026-02-16 13:35:23.624 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:24 compute-1 nova_compute[185910]: 2026-02-16 13:35:24.049 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:26.284 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:35:26 compute-1 nova_compute[185910]: 2026-02-16 13:35:26.284 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:26.285 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:35:26 compute-1 nova_compute[185910]: 2026-02-16 13:35:26.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:35:28 compute-1 nova_compute[185910]: 2026-02-16 13:35:28.625 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:29 compute-1 nova_compute[185910]: 2026-02-16 13:35:29.051 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:29 compute-1 podman[210693]: 2026-02-16 13:35:29.902809 +0000 UTC m=+0.048434603 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:35:33 compute-1 nova_compute[185910]: 2026-02-16 13:35:33.626 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:34 compute-1 nova_compute[185910]: 2026-02-16 13:35:34.054 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:35.288 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:35 compute-1 podman[195236]: time="2026-02-16T13:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:35:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:35:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:35:38 compute-1 nova_compute[185910]: 2026-02-16 13:35:38.628 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:39 compute-1 nova_compute[185910]: 2026-02-16 13:35:39.055 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:43 compute-1 nova_compute[185910]: 2026-02-16 13:35:43.631 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:44 compute-1 nova_compute[185910]: 2026-02-16 13:35:44.059 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:46 compute-1 podman[210718]: 2026-02-16 13:35:46.907774653 +0000 UTC m=+0.044392243 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 13:35:46 compute-1 podman[210717]: 2026-02-16 13:35:46.907731212 +0000 UTC m=+0.048753772 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:35:48 compute-1 nova_compute[185910]: 2026-02-16 13:35:48.687 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:49 compute-1 nova_compute[185910]: 2026-02-16 13:35:49.061 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:49 compute-1 openstack_network_exporter[198096]: ERROR   13:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:35:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:35:49 compute-1 openstack_network_exporter[198096]: ERROR   13:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:35:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:35:49 compute-1 nova_compute[185910]: 2026-02-16 13:35:49.976 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:49 compute-1 nova_compute[185910]: 2026-02-16 13:35:49.977 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.001 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.096 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.096 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.103 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.104 185914 INFO nova.compute.claims [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.240 185914 DEBUG nova.compute.provider_tree [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.265 185914 DEBUG nova.scheduler.client.report [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.287 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.289 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.331 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.331 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.351 185914 INFO nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.370 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.501 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.502 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.502 185914 INFO nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating image(s)
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.503 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.503 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.504 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.516 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.583 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.584 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.585 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.598 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.660 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.661 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.692 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.693 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.694 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.742 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.743 185914 DEBUG nova.virt.disk.api [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.743 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.804 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.806 185914 DEBUG nova.virt.disk.api [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.806 185914 DEBUG nova.objects.instance [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.826 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.827 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Ensure instance console log exists: /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.828 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.828 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:50 compute-1 nova_compute[185910]: 2026-02-16 13:35:50.828 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:51 compute-1 nova_compute[185910]: 2026-02-16 13:35:51.157 185914 DEBUG nova.policy [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:35:51 compute-1 podman[210771]: 2026-02-16 13:35:51.937183019 +0000 UTC m=+0.077660301 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:35:52 compute-1 nova_compute[185910]: 2026-02-16 13:35:52.243 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Successfully created port: 0b908a4c-c96e-4244-b4b4-87f4ac6110bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.119 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Successfully updated port: 0b908a4c-c96e-4244-b4b4-87f4ac6110bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.136 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.137 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.137 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.224 185914 DEBUG nova.compute.manager [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-changed-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.225 185914 DEBUG nova.compute.manager [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Refreshing instance network info cache due to event network-changed-0b908a4c-c96e-4244-b4b4-87f4ac6110bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.225 185914 DEBUG oslo_concurrency.lockutils [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.321 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:35:53 compute-1 nova_compute[185910]: 2026-02-16 13:35:53.689 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.063 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.578 185914 DEBUG nova.network.neutron [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.608 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.608 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance network_info: |[{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.609 185914 DEBUG oslo_concurrency.lockutils [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.609 185914 DEBUG nova.network.neutron [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Refreshing network info cache for port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.612 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Start _get_guest_xml network_info=[{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.617 185914 WARNING nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.620 185914 DEBUG nova.virt.libvirt.host [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.621 185914 DEBUG nova.virt.libvirt.host [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.631 185914 DEBUG nova.virt.libvirt.host [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.631 185914 DEBUG nova.virt.libvirt.host [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.633 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.633 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.634 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.634 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.634 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.634 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.634 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.635 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.635 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.635 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.635 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.636 185914 DEBUG nova.virt.hardware [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.640 185914 DEBUG nova.virt.libvirt.vif [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:35:50Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.640 185914 DEBUG nova.network.os_vif_util [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.641 185914 DEBUG nova.network.os_vif_util [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.642 185914 DEBUG nova.objects.instance [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.664 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <uuid>fea12b84-b444-4299-a1d9-2e974fbb93e0</uuid>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <name>instance-0000000b</name>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-276078181</nova:name>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:35:54</nova:creationTime>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         <nova:port uuid="0b908a4c-c96e-4244-b4b4-87f4ac6110bd">
Feb 16 13:35:54 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <system>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="serial">fea12b84-b444-4299-a1d9-2e974fbb93e0</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="uuid">fea12b84-b444-4299-a1d9-2e974fbb93e0</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </system>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <os>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:37:03:5a"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <target dev="tap0b908a4c-c9"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/console.log" append="off"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <video>
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:35:54 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:35:54 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:35:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:35:54 compute-1 nova_compute[185910]: </domain>
Feb 16 13:35:54 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.665 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Preparing to wait for external event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.666 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.666 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.666 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.667 185914 DEBUG nova.virt.libvirt.vif [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:35:50Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.667 185914 DEBUG nova.network.os_vif_util [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.668 185914 DEBUG nova.network.os_vif_util [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.668 185914 DEBUG os_vif [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.669 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.669 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.669 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.672 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.672 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b908a4c-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.672 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b908a4c-c9, col_values=(('external_ids', {'iface-id': '0b908a4c-c96e-4244-b4b4-87f4ac6110bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:03:5a', 'vm-uuid': 'fea12b84-b444-4299-a1d9-2e974fbb93e0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.674 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 NetworkManager[56388]: <info>  [1771248954.6756] manager: (tap0b908a4c-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.680 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.682 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.683 185914 INFO os_vif [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9')
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.734 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.734 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.734 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:37:03:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:35:54 compute-1 nova_compute[185910]: 2026-02-16 13:35:54.735 185914 INFO nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Using config drive
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.226 185914 INFO nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Creating config drive at /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.230 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplztr8isn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.349 185914 DEBUG oslo_concurrency.processutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmplztr8isn" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:35:55 compute-1 kernel: tap0b908a4c-c9: entered promiscuous mode
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.3969] manager: (tap0b908a4c-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.434 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_controller[96285]: 2026-02-16T13:35:55Z|00091|binding|INFO|Claiming lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd for this chassis.
Feb 16 13:35:55 compute-1 ovn_controller[96285]: 2026-02-16T13:35:55Z|00092|binding|INFO|0b908a4c-c96e-4244-b4b4-87f4ac6110bd: Claiming fa:16:3e:37:03:5a 10.100.0.14
Feb 16 13:35:55 compute-1 systemd-udevd[210814]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.437 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.450 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:03:5a 10.100.0.14'], port_security=['fa:16:3e:37:03:5a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fea12b84-b444-4299-a1d9-2e974fbb93e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=0b908a4c-c96e-4244-b4b4-87f4ac6110bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.451 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.453 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.4555] device (tap0b908a4c-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.4563] device (tap0b908a4c-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:35:55 compute-1 ovn_controller[96285]: 2026-02-16T13:35:55Z|00093|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd ovn-installed in OVS
Feb 16 13:35:55 compute-1 ovn_controller[96285]: 2026-02-16T13:35:55Z|00094|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd up in Southbound
Feb 16 13:35:55 compute-1 systemd-machined[155419]: New machine qemu-8-instance-0000000b.
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.462 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.462 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[424f7909-ac98-4556-9edc-feb84b7d0da5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.463 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.466 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.466 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[18a275b5-c45b-4ee1-b0e6-f5758eaaa25f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.467 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6547090b-97d3-4257-a505-1391ac8149ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-0000000b.
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.478 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b24b38-5a69-42d6-b651-edf86ec66426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.501 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8d30aa-1024-4065-88de-48ed2d133ec9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.530 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[6385da4c-d06a-456f-9ccc-1d613b029aa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.5388] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.538 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[74d541b4-809b-4df9-ac7e-5297599e6191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.566 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[25f8a138-3358-4b8b-8b3e-fb203c6fd46d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.570 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[c53d9fb3-d83b-4ad9-a3de-579d7fc5d6bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.5932] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.596 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd92f4f-ec1a-4b72-9f75-261de7b08c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.612 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc94ec2-d071-4eb0-87cf-480413e7e1ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504001, 'reachable_time': 22561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210850, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.626 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[060bd52c-7d2c-4cc3-b744-b52d43737d2d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504001, 'tstamp': 504001}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210851, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.640 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[145e68b9-2767-425d-9c4e-9d8732a334cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504001, 'reachable_time': 22561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210852, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.673 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6e9fb170-98dc-43b3-897c-f7468332ea87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.723 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[26c03543-04af-4223-8a20-9d4210068341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.725 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.725 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.726 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.727 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 NetworkManager[56388]: <info>  [1771248955.7285] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb 16 13:35:55 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.733 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.734 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.736 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_controller[96285]: 2026-02-16T13:35:55Z|00095|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.739 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.740 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a66925ee-742c-4b29-b38c-76e4709b9566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.741 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.742 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:35:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:35:55.744 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.773 185914 DEBUG nova.compute.manager [req-f3fa6250-7149-48e5-9c37-1fa9c479100e req-6d0492eb-2403-49b4-8c10-3150885a4764 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.774 185914 DEBUG oslo_concurrency.lockutils [req-f3fa6250-7149-48e5-9c37-1fa9c479100e req-6d0492eb-2403-49b4-8c10-3150885a4764 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.774 185914 DEBUG oslo_concurrency.lockutils [req-f3fa6250-7149-48e5-9c37-1fa9c479100e req-6d0492eb-2403-49b4-8c10-3150885a4764 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.775 185914 DEBUG oslo_concurrency.lockutils [req-f3fa6250-7149-48e5-9c37-1fa9c479100e req-6d0492eb-2403-49b4-8c10-3150885a4764 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.775 185914 DEBUG nova.compute.manager [req-f3fa6250-7149-48e5-9c37-1fa9c479100e req-6d0492eb-2403-49b4-8c10-3150885a4764 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Processing event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.803 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.804 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248955.8025029, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.804 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Started (Lifecycle Event)
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.809 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.812 185914 INFO nova.virt.libvirt.driver [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance spawned successfully.
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.813 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.837 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.845 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.851 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.851 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.852 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.852 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.852 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.853 185914 DEBUG nova.virt.libvirt.driver [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.912 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.912 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248955.8063087, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.912 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Paused (Lifecycle Event)
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.944 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.949 185914 INFO nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Took 5.45 seconds to spawn the instance on the hypervisor.
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.950 185914 DEBUG nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.951 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771248955.8088448, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.952 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Resumed (Lifecycle Event)
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.986 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:35:55 compute-1 nova_compute[185910]: 2026-02-16 13:35:55.989 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.021 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.035 185914 INFO nova.compute.manager [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Took 5.98 seconds to build instance.
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.059 185914 DEBUG oslo_concurrency.lockutils [None req-b70d9303-c00a-4c64-9a25-a355fa320ee5 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:56 compute-1 podman[210891]: 2026-02-16 13:35:56.131758715 +0000 UTC m=+0.053451110 container create b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:35:56 compute-1 systemd[1]: Started libpod-conmon-b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7.scope.
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.166 185914 DEBUG nova.network.neutron [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updated VIF entry in instance network info cache for port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.168 185914 DEBUG nova.network.neutron [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:35:56 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:35:56 compute-1 nova_compute[185910]: 2026-02-16 13:35:56.186 185914 DEBUG oslo_concurrency.lockutils [req-df046dd0-d2bd-4419-8ecd-a641dc27bd21 req-cac4bde4-03eb-4107-bdc7-e0c7d4406e56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:35:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a42db01d9b1de39446005a99ffe88bf3294df7b190f30931fdff2480ead233c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:35:56 compute-1 podman[210891]: 2026-02-16 13:35:56.102531387 +0000 UTC m=+0.024223862 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:35:56 compute-1 podman[210891]: 2026-02-16 13:35:56.205123258 +0000 UTC m=+0.126815683 container init b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:35:56 compute-1 podman[210891]: 2026-02-16 13:35:56.209844537 +0000 UTC m=+0.131536932 container start b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:35:56 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [NOTICE]   (210911) : New worker (210913) forked
Feb 16 13:35:56 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [NOTICE]   (210911) : Loading success.
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.958 185914 DEBUG nova.compute.manager [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.958 185914 DEBUG oslo_concurrency.lockutils [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.958 185914 DEBUG oslo_concurrency.lockutils [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.959 185914 DEBUG oslo_concurrency.lockutils [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.959 185914 DEBUG nova.compute.manager [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:35:57 compute-1 nova_compute[185910]: 2026-02-16 13:35:57.959 185914 WARNING nova.compute.manager [req-cf5c194b-4a51-417e-a54a-cae77a320513 req-48914664-93b4-4152-be79-f49dc34c9b25 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state None.
Feb 16 13:35:58 compute-1 sshd-session[210922]: Invalid user apache from 188.166.42.159 port 43810
Feb 16 13:35:58 compute-1 sshd-session[210922]: Connection closed by invalid user apache 188.166.42.159 port 43810 [preauth]
Feb 16 13:35:58 compute-1 nova_compute[185910]: 2026-02-16 13:35:58.692 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:35:59 compute-1 nova_compute[185910]: 2026-02-16 13:35:59.675 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:00 compute-1 podman[210924]: 2026-02-16 13:36:00.163001403 +0000 UTC m=+0.052919456 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:36:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:03.344 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:03.345 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:03.346 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:03 compute-1 nova_compute[185910]: 2026-02-16 13:36:03.695 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:04 compute-1 nova_compute[185910]: 2026-02-16 13:36:04.678 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:05 compute-1 podman[195236]: time="2026-02-16T13:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:36:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:36:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 16 13:36:07 compute-1 sshd-session[210951]: Invalid user admin from 146.190.226.24 port 42716
Feb 16 13:36:07 compute-1 sshd-session[210951]: Connection closed by invalid user admin 146.190.226.24 port 42716 [preauth]
Feb 16 13:36:08 compute-1 nova_compute[185910]: 2026-02-16 13:36:08.699 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:09 compute-1 sshd-session[210971]: Connection closed by authenticating user root 2.57.122.210 port 58028 [preauth]
Feb 16 13:36:09 compute-1 ovn_controller[96285]: 2026-02-16T13:36:09Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:03:5a 10.100.0.14
Feb 16 13:36:09 compute-1 ovn_controller[96285]: 2026-02-16T13:36:09Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:03:5a 10.100.0.14
Feb 16 13:36:09 compute-1 nova_compute[185910]: 2026-02-16 13:36:09.680 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:11 compute-1 nova_compute[185910]: 2026-02-16 13:36:11.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:12 compute-1 nova_compute[185910]: 2026-02-16 13:36:12.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:13 compute-1 nova_compute[185910]: 2026-02-16 13:36:13.701 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:14 compute-1 nova_compute[185910]: 2026-02-16 13:36:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:14 compute-1 nova_compute[185910]: 2026-02-16 13:36:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:14 compute-1 nova_compute[185910]: 2026-02-16 13:36:14.683 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:15 compute-1 nova_compute[185910]: 2026-02-16 13:36:15.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.662 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.663 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.737 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.790 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.791 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.853 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.982 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.984 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5641MB free_disk=73.19464874267578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.984 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:16 compute-1 nova_compute[185910]: 2026-02-16 13:36:16.984 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.073 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance fea12b84-b444-4299-a1d9-2e974fbb93e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.073 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.073 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.096 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.173 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.173 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.186 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.214 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.261 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.280 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.321 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:36:17 compute-1 nova_compute[185910]: 2026-02-16 13:36:17.322 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:17 compute-1 podman[210980]: 2026-02-16 13:36:17.92280104 +0000 UTC m=+0.055417565 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 13:36:17 compute-1 podman[210981]: 2026-02-16 13:36:17.92917465 +0000 UTC m=+0.058615810 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 16 13:36:18 compute-1 nova_compute[185910]: 2026-02-16 13:36:18.703 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:19 compute-1 nova_compute[185910]: 2026-02-16 13:36:19.318 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:19 compute-1 openstack_network_exporter[198096]: ERROR   13:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:36:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:36:19 compute-1 openstack_network_exporter[198096]: ERROR   13:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:36:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:36:19 compute-1 nova_compute[185910]: 2026-02-16 13:36:19.685 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:21 compute-1 nova_compute[185910]: 2026-02-16 13:36:21.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:21 compute-1 nova_compute[185910]: 2026-02-16 13:36:21.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:36:22 compute-1 nova_compute[185910]: 2026-02-16 13:36:22.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:36:22 compute-1 nova_compute[185910]: 2026-02-16 13:36:22.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:36:22 compute-1 nova_compute[185910]: 2026-02-16 13:36:22.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:36:22 compute-1 podman[211018]: 2026-02-16 13:36:22.97124749 +0000 UTC m=+0.112186264 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:36:23 compute-1 nova_compute[185910]: 2026-02-16 13:36:23.252 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:23 compute-1 nova_compute[185910]: 2026-02-16 13:36:23.253 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:23 compute-1 nova_compute[185910]: 2026-02-16 13:36:23.253 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:36:23 compute-1 nova_compute[185910]: 2026-02-16 13:36:23.253 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:23 compute-1 nova_compute[185910]: 2026-02-16 13:36:23.706 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:24 compute-1 nova_compute[185910]: 2026-02-16 13:36:24.688 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:24 compute-1 nova_compute[185910]: 2026-02-16 13:36:24.998 185914 DEBUG nova.compute.manager [None req-4fd7ca79-1ae5-476f-8aee-6486b2000700 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:36:25 compute-1 nova_compute[185910]: 2026-02-16 13:36:25.071 185914 DEBUG nova.compute.provider_tree [None req-4fd7ca79-1ae5-476f-8aee-6486b2000700 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 15 to 16 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:36:25 compute-1 nova_compute[185910]: 2026-02-16 13:36:25.437 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:25 compute-1 nova_compute[185910]: 2026-02-16 13:36:25.459 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:25 compute-1 nova_compute[185910]: 2026-02-16 13:36:25.460 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:36:28 compute-1 nova_compute[185910]: 2026-02-16 13:36:28.709 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:29 compute-1 nova_compute[185910]: 2026-02-16 13:36:29.691 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:30 compute-1 podman[211045]: 2026-02-16 13:36:30.9049336 +0000 UTC m=+0.048120629 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:36:30 compute-1 nova_compute[185910]: 2026-02-16 13:36:30.984 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Check if temp file /var/lib/nova/instances/tmpc1cj4x3y exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:36:30 compute-1 nova_compute[185910]: 2026-02-16 13:36:30.985 185914 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:36:32 compute-1 nova_compute[185910]: 2026-02-16 13:36:32.420 185914 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:32 compute-1 nova_compute[185910]: 2026-02-16 13:36:32.467 185914 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:32 compute-1 nova_compute[185910]: 2026-02-16 13:36:32.468 185914 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:36:32 compute-1 nova_compute[185910]: 2026-02-16 13:36:32.525 185914 DEBUG oslo_concurrency.processutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:36:33 compute-1 nova_compute[185910]: 2026-02-16 13:36:33.713 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:34 compute-1 nova_compute[185910]: 2026-02-16 13:36:34.694 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:35 compute-1 podman[195236]: time="2026-02-16T13:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:36:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:36:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 13:36:35 compute-1 sshd-session[211075]: Accepted publickey for nova from 192.168.122.100 port 40334 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:36:35 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:36:35 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:36:35 compute-1 systemd-logind[821]: New session 36 of user nova.
Feb 16 13:36:35 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:36:35 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:36:35 compute-1 systemd[211079]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:36:36 compute-1 systemd[211079]: Queued start job for default target Main User Target.
Feb 16 13:36:36 compute-1 systemd[211079]: Created slice User Application Slice.
Feb 16 13:36:36 compute-1 systemd[211079]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:36:36 compute-1 systemd[211079]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:36:36 compute-1 systemd[211079]: Reached target Paths.
Feb 16 13:36:36 compute-1 systemd[211079]: Reached target Timers.
Feb 16 13:36:36 compute-1 systemd[211079]: Starting D-Bus User Message Bus Socket...
Feb 16 13:36:36 compute-1 systemd[211079]: Starting Create User's Volatile Files and Directories...
Feb 16 13:36:36 compute-1 systemd[211079]: Finished Create User's Volatile Files and Directories.
Feb 16 13:36:36 compute-1 systemd[211079]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:36:36 compute-1 systemd[211079]: Reached target Sockets.
Feb 16 13:36:36 compute-1 systemd[211079]: Reached target Basic System.
Feb 16 13:36:36 compute-1 systemd[211079]: Reached target Main User Target.
Feb 16 13:36:36 compute-1 systemd[211079]: Startup finished in 128ms.
Feb 16 13:36:36 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:36:36 compute-1 systemd[1]: Started Session 36 of User nova.
Feb 16 13:36:36 compute-1 sshd-session[211075]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:36:36 compute-1 sshd-session[211094]: Received disconnect from 192.168.122.100 port 40334:11: disconnected by user
Feb 16 13:36:36 compute-1 sshd-session[211094]: Disconnected from user nova 192.168.122.100 port 40334
Feb 16 13:36:36 compute-1 sshd-session[211075]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:36:36 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Feb 16 13:36:36 compute-1 systemd-logind[821]: Session 36 logged out. Waiting for processes to exit.
Feb 16 13:36:36 compute-1 systemd-logind[821]: Removed session 36.
Feb 16 13:36:37 compute-1 ovn_controller[96285]: 2026-02-16T13:36:37Z|00096|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.577 185914 DEBUG nova.compute.manager [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.578 185914 DEBUG oslo_concurrency.lockutils [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.578 185914 DEBUG oslo_concurrency.lockutils [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.578 185914 DEBUG oslo_concurrency.lockutils [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.579 185914 DEBUG nova.compute.manager [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.579 185914 DEBUG nova.compute.manager [req-0e487dd9-fec2-4144-9e7b-dfb2846e3195 req-39f2facd-d6ee-4a8a-80bf-ce334a8da8ec faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:36:38 compute-1 nova_compute[185910]: 2026-02-16 13:36:38.715 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:39.216 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:39 compute-1 nova_compute[185910]: 2026-02-16 13:36:39.217 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:39.217 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:36:39 compute-1 nova_compute[185910]: 2026-02-16 13:36:39.696 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.295 185914 INFO nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Took 7.77 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.296 185914 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.324 185914 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc1cj4x3y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fea12b84-b444-4299-a1d9-2e974fbb93e0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a2ce4074-e70f-4a69-84d1-2d9db78c77b6),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.353 185914 DEBUG nova.objects.instance [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid fea12b84-b444-4299-a1d9-2e974fbb93e0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.354 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.356 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.356 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.389 185914 DEBUG nova.virt.libvirt.vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:35:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:35:56Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.389 185914 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.390 185914 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.391 185914 DEBUG nova.virt.libvirt.migration [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:36:40 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:37:03:5a"/>
Feb 16 13:36:40 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:36:40 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:36:40 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:36:40 compute-1 nova_compute[185910]:   <target dev="tap0b908a4c-c9"/>
Feb 16 13:36:40 compute-1 nova_compute[185910]: </interface>
Feb 16 13:36:40 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.392 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.657 185914 DEBUG nova.compute.manager [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.658 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.659 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.659 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.659 185914 DEBUG nova.compute.manager [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.659 185914 WARNING nova.compute.manager [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state migrating.
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.660 185914 DEBUG nova.compute.manager [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-changed-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.660 185914 DEBUG nova.compute.manager [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Refreshing instance network info cache due to event network-changed-0b908a4c-c96e-4244-b4b4-87f4ac6110bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.660 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.660 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.661 185914 DEBUG nova.network.neutron [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Refreshing network info cache for port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.859 185914 DEBUG nova.virt.libvirt.migration [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.860 185914 INFO nova.virt.libvirt.migration [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:36:40 compute-1 nova_compute[185910]: 2026-02-16 13:36:40.925 185914 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.430 185914 DEBUG nova.virt.libvirt.migration [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.431 185914 DEBUG nova.virt.libvirt.migration [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.646 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249001.646263, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.647 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Paused (Lifecycle Event)
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.666 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.671 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.687 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:36:41 compute-1 kernel: tap0b908a4c-c9 (unregistering): left promiscuous mode
Feb 16 13:36:41 compute-1 NetworkManager[56388]: <info>  [1771249001.7950] device (tap0b908a4c-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.796 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:41 compute-1 ovn_controller[96285]: 2026-02-16T13:36:41Z|00097|binding|INFO|Releasing lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd from this chassis (sb_readonly=0)
Feb 16 13:36:41 compute-1 ovn_controller[96285]: 2026-02-16T13:36:41Z|00098|binding|INFO|Setting lport 0b908a4c-c96e-4244-b4b4-87f4ac6110bd down in Southbound
Feb 16 13:36:41 compute-1 ovn_controller[96285]: 2026-02-16T13:36:41Z|00099|binding|INFO|Removing iface tap0b908a4c-c9 ovn-installed in OVS
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.807 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:41.814 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:03:5a 10.100.0.14'], port_security=['fa:16:3e:37:03:5a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fea12b84-b444-4299-a1d9-2e974fbb93e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=0b908a4c-c96e-4244-b4b4-87f4ac6110bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.815 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:41.818 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:36:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:41.820 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:36:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:41.822 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3d866582-8050-42b7-a83b-b94f0da99874]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:41.823 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:36:41 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 16 13:36:41 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000b.scope: Consumed 14.337s CPU time.
Feb 16 13:36:41 compute-1 systemd-machined[155419]: Machine qemu-8-instance-0000000b terminated.
Feb 16 13:36:41 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [NOTICE]   (210911) : haproxy version is 2.8.14-c23fe91
Feb 16 13:36:41 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [NOTICE]   (210911) : path to executable is /usr/sbin/haproxy
Feb 16 13:36:41 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [WARNING]  (210911) : Exiting Master process...
Feb 16 13:36:41 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [ALERT]    (210911) : Current worker (210913) exited with code 143 (Terminated)
Feb 16 13:36:41 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[210907]: [WARNING]  (210911) : All workers exited. Exiting... (0)
Feb 16 13:36:41 compute-1 systemd[1]: libpod-b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7.scope: Deactivated successfully.
Feb 16 13:36:41 compute-1 podman[211126]: 2026-02-16 13:36:41.98900903 +0000 UTC m=+0.059354900 container died b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 16 13:36:41 compute-1 nova_compute[185910]: 2026-02-16 13:36:41.996 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.001 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7-userdata-shm.mount: Deactivated successfully.
Feb 16 13:36:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-a42db01d9b1de39446005a99ffe88bf3294df7b190f30931fdff2480ead233c6-merged.mount: Deactivated successfully.
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.032 185914 DEBUG nova.virt.libvirt.guest [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.034 185914 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migration operation has completed
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.034 185914 INFO nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] _post_live_migration() is started..
Feb 16 13:36:42 compute-1 podman[211126]: 2026-02-16 13:36:42.041612148 +0000 UTC m=+0.111958008 container cleanup b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.042 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.043 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.043 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:36:42 compute-1 systemd[1]: libpod-conmon-b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7.scope: Deactivated successfully.
Feb 16 13:36:42 compute-1 podman[211169]: 2026-02-16 13:36:42.1122642 +0000 UTC m=+0.048893620 container remove b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.118 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8f05b1-69eb-4ba4-b41f-39e2cfe112de]: (4, ('Mon Feb 16 01:36:41 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7)\nb22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7\nMon Feb 16 01:36:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (b22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7)\nb22e277d4c1b3f53c759b65a45d083470a769d1beb0d97c922ce669e6b4646c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.121 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[efe9760d-2624-4bb5-85d7-9f26fc14c543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.122 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.124 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.133 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.138 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[cd56a87a-bf2e-4d03-9830-db14b2777dd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.160 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a65cb9-69bb-42e7-9dd4-faa4fe7ab26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.161 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[df4aaef9-df0f-4632-b6d5-dfb54ff618a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.175 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[faccd54d-2543-4c95-b3f0-802eacfbedd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503994, 'reachable_time': 18052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211188, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.177 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:36:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:42.177 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[97fb73cf-a8c1-4426-a299-96cb550b7d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.783 185914 DEBUG nova.network.neutron [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.784 185914 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.785 185914 DEBUG nova.virt.libvirt.vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:35:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-276078181',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-276078181',id=11,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:35:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-14c0jw0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:36:27Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=fea12b84-b444-4299-a1d9-2e974fbb93e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.785 185914 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.786 185914 DEBUG nova.network.os_vif_util [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.787 185914 DEBUG os_vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.789 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.790 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b908a4c-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.792 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.794 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.797 185914 INFO os_vif [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:03:5a,bridge_name='br-int',has_traffic_filtering=True,id=0b908a4c-c96e-4244-b4b4-87f4ac6110bd,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b908a4c-c9')
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.797 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.798 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.798 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.798 185914 DEBUG nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.799 185914 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Deleting instance files /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0_del
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.799 185914 INFO nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Deletion of /var/lib/nova/instances/fea12b84-b444-4299-a1d9-2e974fbb93e0_del complete
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.807 185914 DEBUG nova.compute.manager [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.808 185914 DEBUG oslo_concurrency.lockutils [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.808 185914 DEBUG oslo_concurrency.lockutils [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.808 185914 DEBUG oslo_concurrency.lockutils [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.809 185914 DEBUG nova.compute.manager [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:42 compute-1 nova_compute[185910]: 2026-02-16 13:36:42.809 185914 DEBUG nova.compute.manager [req-ad28bb9f-4724-40b4-ad08-64e1b1727120 req-9c25e550-79c4-4b33-b16e-666e9151438e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-unplugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:36:43 compute-1 nova_compute[185910]: 2026-02-16 13:36:43.479 185914 DEBUG nova.network.neutron [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updated VIF entry in instance network info cache for port 0b908a4c-c96e-4244-b4b4-87f4ac6110bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:36:43 compute-1 nova_compute[185910]: 2026-02-16 13:36:43.480 185914 DEBUG nova.network.neutron [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Updating instance_info_cache with network_info: [{"id": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "address": "fa:16:3e:37:03:5a", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b908a4c-c9", "ovs_interfaceid": "0b908a4c-c96e-4244-b4b4-87f4ac6110bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:36:43 compute-1 nova_compute[185910]: 2026-02-16 13:36:43.511 185914 DEBUG oslo_concurrency.lockutils [req-ed7d0114-013f-4cc1-bf83-1e50f1c2ada2 req-3bf61152-bb8e-423e-b3bf-3d5d24037b3d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-fea12b84-b444-4299-a1d9-2e974fbb93e0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:36:43 compute-1 nova_compute[185910]: 2026-02-16 13:36:43.749 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.941 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.941 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.942 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.942 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.942 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.943 185914 WARNING nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state migrating.
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.943 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.943 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.944 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.944 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.944 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.945 185914 WARNING nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state migrating.
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.945 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.945 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.946 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.946 185914 DEBUG oslo_concurrency.lockutils [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.946 185914 DEBUG nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] No waiting events found dispatching network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:36:44 compute-1 nova_compute[185910]: 2026-02-16 13:36:44.947 185914 WARNING nova.compute.manager [req-2311d466-cd45-4849-810c-00dfc2b4ba7d req-621bfca5-ea10-4add-87b7-17a65cf69ffa faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Received unexpected event network-vif-plugged-0b908a4c-c96e-4244-b4b4-87f4ac6110bd for instance with vm_state active and task_state migrating.
Feb 16 13:36:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:36:46.219 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:36:46 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:36:46 compute-1 systemd[211079]: Activating special unit Exit the Session...
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped target Main User Target.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped target Basic System.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped target Paths.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped target Sockets.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped target Timers.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:36:46 compute-1 systemd[211079]: Closed D-Bus User Message Bus Socket.
Feb 16 13:36:46 compute-1 systemd[211079]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:36:46 compute-1 systemd[211079]: Removed slice User Application Slice.
Feb 16 13:36:46 compute-1 systemd[211079]: Reached target Shutdown.
Feb 16 13:36:46 compute-1 systemd[211079]: Finished Exit the Session.
Feb 16 13:36:46 compute-1 systemd[211079]: Reached target Exit the Session.
Feb 16 13:36:46 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:36:46 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:36:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:36:46 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:36:46 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:36:46 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:36:46 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:36:47 compute-1 nova_compute[185910]: 2026-02-16 13:36:47.794 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:48 compute-1 nova_compute[185910]: 2026-02-16 13:36:48.753 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:48 compute-1 podman[211193]: 2026-02-16 13:36:48.91478731 +0000 UTC m=+0.053990247 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 16 13:36:48 compute-1 podman[211192]: 2026-02-16 13:36:48.920126462 +0000 UTC m=+0.059572835 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 16 13:36:49 compute-1 openstack_network_exporter[198096]: ERROR   13:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:36:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:36:49 compute-1 openstack_network_exporter[198096]: ERROR   13:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:36:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.544 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.545 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.545 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "fea12b84-b444-4299-a1d9-2e974fbb93e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.564 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.565 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.565 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.565 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.694 185914 WARNING nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.695 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5793MB free_disk=73.22367477416992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.696 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.696 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.753 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance fea12b84-b444-4299-a1d9-2e974fbb93e0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.779 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.826 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration a2ce4074-e70f-4a69-84d1-2d9db78c77b6 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.827 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.827 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.871 185914 DEBUG nova.compute.provider_tree [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.888 185914 DEBUG nova.scheduler.client.report [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.916 185914 DEBUG nova.compute.resource_tracker [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.917 185914 DEBUG oslo_concurrency.lockutils [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:36:49 compute-1 nova_compute[185910]: 2026-02-16 13:36:49.922 185914 INFO nova.compute.manager [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:36:50 compute-1 nova_compute[185910]: 2026-02-16 13:36:50.002 185914 INFO nova.scheduler.client.report [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration a2ce4074-e70f-4a69-84d1-2d9db78c77b6
Feb 16 13:36:50 compute-1 nova_compute[185910]: 2026-02-16 13:36:50.003 185914 DEBUG nova.virt.libvirt.driver [None req-8dd25858-979a-4e7c-8f1c-508c6184b1d9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:36:52 compute-1 sshd-session[211233]: Invalid user postgres from 188.166.42.159 port 60596
Feb 16 13:36:52 compute-1 sshd-session[211233]: Connection closed by invalid user postgres 188.166.42.159 port 60596 [preauth]
Feb 16 13:36:52 compute-1 nova_compute[185910]: 2026-02-16 13:36:52.796 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:53 compute-1 nova_compute[185910]: 2026-02-16 13:36:53.754 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:53 compute-1 podman[211235]: 2026-02-16 13:36:53.923482347 +0000 UTC m=+0.068788392 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:36:57 compute-1 nova_compute[185910]: 2026-02-16 13:36:57.033 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249002.0317755, fea12b84-b444-4299-a1d9-2e974fbb93e0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:36:57 compute-1 nova_compute[185910]: 2026-02-16 13:36:57.034 185914 INFO nova.compute.manager [-] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] VM Stopped (Lifecycle Event)
Feb 16 13:36:57 compute-1 nova_compute[185910]: 2026-02-16 13:36:57.143 185914 DEBUG nova.compute.manager [None req-0e558e98-d6da-4050-986a-d97baefc010f - - - - - -] [instance: fea12b84-b444-4299-a1d9-2e974fbb93e0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:36:57 compute-1 nova_compute[185910]: 2026-02-16 13:36:57.834 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:36:58 compute-1 nova_compute[185910]: 2026-02-16 13:36:58.758 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:01 compute-1 podman[211261]: 2026-02-16 13:37:01.918871721 +0000 UTC m=+0.056815892 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:37:02 compute-1 nova_compute[185910]: 2026-02-16 13:37:02.837 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:03.344 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:03.345 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:03.345 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:03 compute-1 nova_compute[185910]: 2026-02-16 13:37:03.759 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:05 compute-1 podman[195236]: time="2026-02-16T13:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:37:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:37:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 16 13:37:07 compute-1 nova_compute[185910]: 2026-02-16 13:37:07.840 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:08 compute-1 nova_compute[185910]: 2026-02-16 13:37:08.762 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:12 compute-1 nova_compute[185910]: 2026-02-16 13:37:12.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:12 compute-1 nova_compute[185910]: 2026-02-16 13:37:12.841 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:13 compute-1 nova_compute[185910]: 2026-02-16 13:37:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:13 compute-1 nova_compute[185910]: 2026-02-16 13:37:13.792 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:14 compute-1 sshd-session[211288]: Invalid user admin from 146.190.226.24 port 42674
Feb 16 13:37:15 compute-1 sshd-session[211288]: Connection closed by invalid user admin 146.190.226.24 port 42674 [preauth]
Feb 16 13:37:15 compute-1 nova_compute[185910]: 2026-02-16 13:37:15.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.665 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.665 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.665 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.826 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.828 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5813MB free_disk=73.22367477416992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.828 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.828 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.904 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.905 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.926 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.944 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.945 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:37:16 compute-1 nova_compute[185910]: 2026-02-16 13:37:16.945 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:17 compute-1 nova_compute[185910]: 2026-02-16 13:37:17.843 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:17 compute-1 nova_compute[185910]: 2026-02-16 13:37:17.855 185914 DEBUG nova.compute.manager [None req-999e4f77-f957-4bcd-9bf4-2c983fae9bc4 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:37:17 compute-1 nova_compute[185910]: 2026-02-16 13:37:17.916 185914 DEBUG nova.compute.provider_tree [None req-999e4f77-f957-4bcd-9bf4-2c983fae9bc4 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 16 to 19 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:37:18 compute-1 nova_compute[185910]: 2026-02-16 13:37:18.794 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:19 compute-1 openstack_network_exporter[198096]: ERROR   13:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:37:19 compute-1 openstack_network_exporter[198096]: ERROR   13:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:37:19 compute-1 podman[211291]: 2026-02-16 13:37:19.904083321 +0000 UTC m=+0.043966758 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Feb 16 13:37:19 compute-1 podman[211290]: 2026-02-16 13:37:19.908956851 +0000 UTC m=+0.050370269 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, release=1770267347, version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc.)
Feb 16 13:37:20 compute-1 nova_compute[185910]: 2026-02-16 13:37:20.941 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.650 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.650 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.651 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:37:22 compute-1 nova_compute[185910]: 2026-02-16 13:37:22.845 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:23 compute-1 nova_compute[185910]: 2026-02-16 13:37:23.797 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:24 compute-1 podman[211328]: 2026-02-16 13:37:24.92888191 +0000 UTC m=+0.067806006 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 13:37:26 compute-1 nova_compute[185910]: 2026-02-16 13:37:26.646 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:37:27 compute-1 nova_compute[185910]: 2026-02-16 13:37:27.848 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:28 compute-1 nova_compute[185910]: 2026-02-16 13:37:28.799 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:29 compute-1 ovn_controller[96285]: 2026-02-16T13:37:29Z|00100|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.075 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.076 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.093 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.160 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.161 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.168 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.168 185914 INFO nova.compute.claims [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.290 185914 DEBUG nova.compute.provider_tree [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.312 185914 DEBUG nova.scheduler.client.report [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.340 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.340 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.396 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.397 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.419 185914 INFO nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.439 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.525 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.527 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.527 185914 INFO nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating image(s)
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.528 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.528 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.529 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.548 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.592 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.594 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.595 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.613 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.659 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.660 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.689 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.691 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.692 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.742 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.743 185914 DEBUG nova.virt.disk.api [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.744 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.785 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.786 185914 DEBUG nova.virt.disk.api [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.786 185914 DEBUG nova.objects.instance [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.800 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.801 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Ensure instance console log exists: /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.801 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.802 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.802 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:32 compute-1 nova_compute[185910]: 2026-02-16 13:37:32.850 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:32 compute-1 podman[211369]: 2026-02-16 13:37:32.904243148 +0000 UTC m=+0.047447931 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:37:33 compute-1 nova_compute[185910]: 2026-02-16 13:37:33.595 185914 DEBUG nova.policy [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:37:33 compute-1 nova_compute[185910]: 2026-02-16 13:37:33.801 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:35 compute-1 nova_compute[185910]: 2026-02-16 13:37:35.453 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Successfully created port: 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:37:35 compute-1 podman[195236]: time="2026-02-16T13:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:37:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:37:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.381 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Successfully updated port: 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.400 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.401 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.401 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.494 185914 DEBUG nova.compute.manager [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-changed-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.495 185914 DEBUG nova.compute.manager [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Refreshing instance network info cache due to event network-changed-6eb3ffb6-7a82-44c5-98d8-1fa609426d92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.495 185914 DEBUG oslo_concurrency.lockutils [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.595 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:37:37 compute-1 nova_compute[185910]: 2026-02-16 13:37:37.891 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:38 compute-1 nova_compute[185910]: 2026-02-16 13:37:38.803 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.343 185914 DEBUG nova.network.neutron [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.368 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.368 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance network_info: |[{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.369 185914 DEBUG oslo_concurrency.lockutils [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.369 185914 DEBUG nova.network.neutron [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Refreshing network info cache for port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.372 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Start _get_guest_xml network_info=[{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.377 185914 WARNING nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.383 185914 DEBUG nova.virt.libvirt.host [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.384 185914 DEBUG nova.virt.libvirt.host [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.391 185914 DEBUG nova.virt.libvirt.host [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.392 185914 DEBUG nova.virt.libvirt.host [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.394 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.394 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.394 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.395 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.395 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.395 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.395 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.396 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.396 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.396 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.397 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.397 185914 DEBUG nova.virt.hardware [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.401 185914 DEBUG nova.virt.libvirt.vif [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:37:32Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.401 185914 DEBUG nova.network.os_vif_util [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.402 185914 DEBUG nova.network.os_vif_util [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.403 185914 DEBUG nova.objects.instance [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.418 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <uuid>3c7e1337-03a5-4860-9bdf-2ff0df92ca75</uuid>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <name>instance-0000000d</name>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-931541268</nova:name>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:37:39</nova:creationTime>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         <nova:port uuid="6eb3ffb6-7a82-44c5-98d8-1fa609426d92">
Feb 16 13:37:39 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <system>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="serial">3c7e1337-03a5-4860-9bdf-2ff0df92ca75</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="uuid">3c7e1337-03a5-4860-9bdf-2ff0df92ca75</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </system>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <os>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </os>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <features>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </features>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:7c:9b:d1"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <target dev="tap6eb3ffb6-7a"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/console.log" append="off"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <video>
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </video>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:37:39 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:37:39 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:37:39 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:37:39 compute-1 nova_compute[185910]: </domain>
Feb 16 13:37:39 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.419 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Preparing to wait for external event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.419 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.419 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.419 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.420 185914 DEBUG nova.virt.libvirt.vif [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:37:32Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.420 185914 DEBUG nova.network.os_vif_util [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.421 185914 DEBUG nova.network.os_vif_util [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.422 185914 DEBUG os_vif [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.422 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.423 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.423 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.427 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.427 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6eb3ffb6-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.428 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6eb3ffb6-7a, col_values=(('external_ids', {'iface-id': '6eb3ffb6-7a82-44c5-98d8-1fa609426d92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:9b:d1', 'vm-uuid': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.429 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-1 NetworkManager[56388]: <info>  [1771249059.4307] manager: (tap6eb3ffb6-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.433 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.435 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.435 185914 INFO os_vif [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a')
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.481 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.481 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.481 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:7c:9b:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:37:39 compute-1 nova_compute[185910]: 2026-02-16 13:37:39.482 185914 INFO nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Using config drive
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.381 185914 INFO nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Creating config drive at /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.385 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjp46bwtp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.509 185914 DEBUG oslo_concurrency.processutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjp46bwtp" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:37:41 compute-1 kernel: tap6eb3ffb6-7a: entered promiscuous mode
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.5733] manager: (tap6eb3ffb6-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.575 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 ovn_controller[96285]: 2026-02-16T13:37:41Z|00101|binding|INFO|Claiming lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for this chassis.
Feb 16 13:37:41 compute-1 ovn_controller[96285]: 2026-02-16T13:37:41Z|00102|binding|INFO|6eb3ffb6-7a82-44c5-98d8-1fa609426d92: Claiming fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:37:41 compute-1 ovn_controller[96285]: 2026-02-16T13:37:41Z|00103|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 ovn-installed in OVS
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.581 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.583 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 ovn_controller[96285]: 2026-02-16T13:37:41Z|00104|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 up in Southbound
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.585 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.586 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.587 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.588 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.596 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[db420f88-0b7e-498a-8e11-fd46e7116e90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.597 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.599 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.599 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1d18fb-c39c-4774-94a4-0acfff1af19a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.600 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[26c5d484-6974-49c2-9112-0da20876609f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 systemd-udevd[211414]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:37:41 compute-1 systemd-machined[155419]: New machine qemu-9-instance-0000000d.
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.609 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[5348380b-6fcf-4897-af4d-12b9d500d9e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.6114] device (tap6eb3ffb6-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.6120] device (tap6eb3ffb6-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:37:41 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.619 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c31e8879-b98a-4bdb-99e4-8eba5fb5ac5e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.645 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[9a377ee8-dad5-4edb-b22a-c3d758a1d663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.651 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5f3c39-9c84-4a96-8e31-6a058de7aa86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 systemd-udevd[211417]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.6527] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.679 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4dc184-0e23-42c2-aa96-adebff2db754]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.683 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5e87f53a-f129-4282-a008-c854db978673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.7059] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.711 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[2f611c83-a70c-4879-b72a-4e841a148568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.730 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3e874c-f8f4-4d25-a310-3c68f2d06418]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514612, 'reachable_time': 36978, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211446, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.744 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[900af446-55bb-4190-8c5c-95781677475e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 514612, 'tstamp': 514612}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211447, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.762 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac886ed-01ab-4fb5-baad-d960c1181dd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514612, 'reachable_time': 36978, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211450, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.790 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[050cee01-1baa-4bd4-be82-35b4fcbbdee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.835 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb14f78-6ebc-474e-bb40-c1928136ebbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.836 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249061.8363962, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.837 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Started (Lifecycle Event)
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.837 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.837 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.838 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.839 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.841 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 NetworkManager[56388]: <info>  [1771249061.8414] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.841 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.843 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 ovn_controller[96285]: 2026-02-16T13:37:41Z|00105|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.843 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.844 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.845 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[be813f62-5c8d-4f9e-bc96-c85dc8063862]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.846 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:37:41 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:41.846 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.847 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.856 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.860 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249061.8393192, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.861 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Paused (Lifecycle Event)
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.883 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.887 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:37:41 compute-1 nova_compute[185910]: 2026-02-16 13:37:41.909 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:37:42 compute-1 podman[211487]: 2026-02-16 13:37:42.171676025 +0000 UTC m=+0.046782853 container create 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:37:42 compute-1 systemd[1]: Started libpod-conmon-6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94.scope.
Feb 16 13:37:42 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:37:42 compute-1 podman[211487]: 2026-02-16 13:37:42.145763282 +0000 UTC m=+0.020870140 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:37:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f657aa004c8c34b72561fd329d8955f48c44333db96f3fff724d2eb1187f5c65/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:37:42 compute-1 podman[211487]: 2026-02-16 13:37:42.258411937 +0000 UTC m=+0.133518805 container init 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:37:42 compute-1 podman[211487]: 2026-02-16 13:37:42.266223716 +0000 UTC m=+0.141330554 container start 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:37:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [NOTICE]   (211507) : New worker (211509) forked
Feb 16 13:37:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [NOTICE]   (211507) : Loading success.
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.565 185914 DEBUG nova.compute.manager [req-979d1228-d177-4090-8667-ee425b0bc166 req-02288566-e160-41e3-87ed-95961a1301b6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.566 185914 DEBUG oslo_concurrency.lockutils [req-979d1228-d177-4090-8667-ee425b0bc166 req-02288566-e160-41e3-87ed-95961a1301b6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.566 185914 DEBUG oslo_concurrency.lockutils [req-979d1228-d177-4090-8667-ee425b0bc166 req-02288566-e160-41e3-87ed-95961a1301b6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.566 185914 DEBUG oslo_concurrency.lockutils [req-979d1228-d177-4090-8667-ee425b0bc166 req-02288566-e160-41e3-87ed-95961a1301b6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.567 185914 DEBUG nova.compute.manager [req-979d1228-d177-4090-8667-ee425b0bc166 req-02288566-e160-41e3-87ed-95961a1301b6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Processing event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.567 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.571 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249062.5715835, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.572 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Resumed (Lifecycle Event)
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.573 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.577 185914 INFO nova.virt.libvirt.driver [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance spawned successfully.
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.577 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.605 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.609 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.610 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.610 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.610 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.611 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.611 185914 DEBUG nova.virt.libvirt.driver [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.614 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.642 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.675 185914 INFO nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Took 10.15 seconds to spawn the instance on the hypervisor.
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.676 185914 DEBUG nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:37:42 compute-1 nova_compute[185910]: 2026-02-16 13:37:42.779 185914 INFO nova.compute.manager [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Took 10.64 seconds to build instance.
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.113 185914 DEBUG oslo_concurrency.lockutils [None req-bf3ad799-e7a3-451d-b028-6c4f360d6c49 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.359 185914 DEBUG nova.network.neutron [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updated VIF entry in instance network info cache for port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.359 185914 DEBUG nova.network.neutron [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.375 185914 DEBUG oslo_concurrency.lockutils [req-d4975b42-ff6a-4bf0-95f3-9c087b2dc81c req-c9a7a429-5ec4-499d-8ed4-2114e121a504 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:37:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:43.439 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:43.440 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:37:43 compute-1 nova_compute[185910]: 2026-02-16 13:37:43.804 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.430 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.664 185914 DEBUG nova.compute.manager [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.665 185914 DEBUG oslo_concurrency.lockutils [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.666 185914 DEBUG oslo_concurrency.lockutils [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.666 185914 DEBUG oslo_concurrency.lockutils [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.666 185914 DEBUG nova.compute.manager [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:37:44 compute-1 nova_compute[185910]: 2026-02-16 13:37:44.667 185914 WARNING nova.compute.manager [req-d854481a-eb7d-4c0f-b110-3d3075151a75 req-08e6e534-ce06-4af3-84bc-b119abaf44bc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state None.
Feb 16 13:37:45 compute-1 sshd-session[211518]: Invalid user postgres from 188.166.42.159 port 44266
Feb 16 13:37:45 compute-1 sshd-session[211518]: Connection closed by invalid user postgres 188.166.42.159 port 44266 [preauth]
Feb 16 13:37:48 compute-1 nova_compute[185910]: 2026-02-16 13:37:48.807 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:49 compute-1 openstack_network_exporter[198096]: ERROR   13:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:37:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:37:49 compute-1 openstack_network_exporter[198096]: ERROR   13:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:37:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:37:49 compute-1 nova_compute[185910]: 2026-02-16 13:37:49.431 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:37:50.442 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:37:50 compute-1 podman[211520]: 2026-02-16 13:37:50.921796855 +0000 UTC m=+0.059945356 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, distribution-scope=public, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 16 13:37:50 compute-1 podman[211521]: 2026-02-16 13:37:50.9402953 +0000 UTC m=+0.078456012 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:37:53 compute-1 nova_compute[185910]: 2026-02-16 13:37:53.808 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:54 compute-1 nova_compute[185910]: 2026-02-16 13:37:54.433 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:54 compute-1 ovn_controller[96285]: 2026-02-16T13:37:54Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:37:54 compute-1 ovn_controller[96285]: 2026-02-16T13:37:54Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:37:55 compute-1 podman[211578]: 2026-02-16 13:37:55.954475063 +0000 UTC m=+0.098927740 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller)
Feb 16 13:37:58 compute-1 nova_compute[185910]: 2026-02-16 13:37:58.811 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:37:59 compute-1 nova_compute[185910]: 2026-02-16 13:37:59.435 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:03.345 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:03.346 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:03.346 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:03 compute-1 nova_compute[185910]: 2026-02-16 13:38:03.813 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:03 compute-1 podman[211604]: 2026-02-16 13:38:03.943929567 +0000 UTC m=+0.083139177 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:38:04 compute-1 nova_compute[185910]: 2026-02-16 13:38:04.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:05 compute-1 podman[195236]: time="2026-02-16T13:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:38:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:38:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 13:38:08 compute-1 nova_compute[185910]: 2026-02-16 13:38:08.814 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:09 compute-1 nova_compute[185910]: 2026-02-16 13:38:09.441 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:13 compute-1 nova_compute[185910]: 2026-02-16 13:38:13.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:13 compute-1 nova_compute[185910]: 2026-02-16 13:38:13.861 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:14 compute-1 nova_compute[185910]: 2026-02-16 13:38:14.445 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:14 compute-1 nova_compute[185910]: 2026-02-16 13:38:14.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:16 compute-1 nova_compute[185910]: 2026-02-16 13:38:16.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:16 compute-1 nova_compute[185910]: 2026-02-16 13:38:16.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.673 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.673 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.673 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.674 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.749 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.805 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.806 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:17 compute-1 nova_compute[185910]: 2026-02-16 13:38:17.861 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.031 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.032 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5660MB free_disk=73.19494247436523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.033 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.033 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.101 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.102 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.102 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.144 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.163 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.198 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.199 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:18 compute-1 nova_compute[185910]: 2026-02-16 13:38:18.863 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:19 compute-1 nova_compute[185910]: 2026-02-16 13:38:19.452 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:19 compute-1 openstack_network_exporter[198096]: ERROR   13:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:38:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:38:19 compute-1 openstack_network_exporter[198096]: ERROR   13:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:38:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:38:21 compute-1 sshd-session[211637]: Invalid user admin from 146.190.226.24 port 37450
Feb 16 13:38:21 compute-1 nova_compute[185910]: 2026-02-16 13:38:21.193 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:21 compute-1 podman[211640]: 2026-02-16 13:38:21.247481649 +0000 UTC m=+0.061906718 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 16 13:38:21 compute-1 podman[211639]: 2026-02-16 13:38:21.247478809 +0000 UTC m=+0.061539518 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, release=1770267347, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter)
Feb 16 13:38:21 compute-1 sshd-session[211637]: Connection closed by invalid user admin 146.190.226.24 port 37450 [preauth]
Feb 16 13:38:21 compute-1 nova_compute[185910]: 2026-02-16 13:38:21.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:21 compute-1 nova_compute[185910]: 2026-02-16 13:38:21.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:38:21 compute-1 nova_compute[185910]: 2026-02-16 13:38:21.653 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:38:22 compute-1 ovn_controller[96285]: 2026-02-16T13:38:22Z|00106|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:38:23 compute-1 nova_compute[185910]: 2026-02-16 13:38:23.653 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:23 compute-1 nova_compute[185910]: 2026-02-16 13:38:23.654 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:38:23 compute-1 nova_compute[185910]: 2026-02-16 13:38:23.654 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:38:23 compute-1 nova_compute[185910]: 2026-02-16 13:38:23.866 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:24 compute-1 nova_compute[185910]: 2026-02-16 13:38:24.385 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:38:24 compute-1 nova_compute[185910]: 2026-02-16 13:38:24.385 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:38:24 compute-1 nova_compute[185910]: 2026-02-16 13:38:24.386 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:38:24 compute-1 nova_compute[185910]: 2026-02-16 13:38:24.386 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:24 compute-1 nova_compute[185910]: 2026-02-16 13:38:24.454 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:25 compute-1 nova_compute[185910]: 2026-02-16 13:38:25.312 185914 DEBUG nova.compute.manager [None req-d28a0c59-4856-432c-b19b-c3f2ce62bdf8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:38:25 compute-1 nova_compute[185910]: 2026-02-16 13:38:25.383 185914 DEBUG nova.compute.provider_tree [None req-d28a0c59-4856-432c-b19b-c3f2ce62bdf8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 19 to 21 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.657 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.676 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.677 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.677 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:38:26 compute-1 nova_compute[185910]: 2026-02-16 13:38:26.677 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:26 compute-1 podman[211680]: 2026-02-16 13:38:26.946827287 +0000 UTC m=+0.087571855 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:38:28 compute-1 nova_compute[185910]: 2026-02-16 13:38:28.868 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:29 compute-1 nova_compute[185910]: 2026-02-16 13:38:29.457 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:29 compute-1 nova_compute[185910]: 2026-02-16 13:38:29.690 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Check if temp file /var/lib/nova/instances/tmp4juvj7yy exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:38:29 compute-1 nova_compute[185910]: 2026-02-16 13:38:29.691 185914 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:38:30 compute-1 nova_compute[185910]: 2026-02-16 13:38:30.267 185914 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:30 compute-1 nova_compute[185910]: 2026-02-16 13:38:30.319 185914 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:30 compute-1 nova_compute[185910]: 2026-02-16 13:38:30.320 185914 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:38:30 compute-1 nova_compute[185910]: 2026-02-16 13:38:30.367 185914 DEBUG oslo_concurrency.processutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:38:31 compute-1 sshd-session[211713]: Accepted publickey for nova from 192.168.122.100 port 36472 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:38:31 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:38:31 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:38:31 compute-1 systemd-logind[821]: New session 38 of user nova.
Feb 16 13:38:31 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:38:31 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:38:31 compute-1 systemd[211717]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:38:32 compute-1 systemd[211717]: Queued start job for default target Main User Target.
Feb 16 13:38:32 compute-1 systemd[211717]: Created slice User Application Slice.
Feb 16 13:38:32 compute-1 systemd[211717]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:38:32 compute-1 systemd[211717]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:38:32 compute-1 systemd[211717]: Reached target Paths.
Feb 16 13:38:32 compute-1 systemd[211717]: Reached target Timers.
Feb 16 13:38:32 compute-1 systemd[211717]: Starting D-Bus User Message Bus Socket...
Feb 16 13:38:32 compute-1 systemd[211717]: Starting Create User's Volatile Files and Directories...
Feb 16 13:38:32 compute-1 systemd[211717]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:38:32 compute-1 systemd[211717]: Reached target Sockets.
Feb 16 13:38:32 compute-1 systemd[211717]: Finished Create User's Volatile Files and Directories.
Feb 16 13:38:32 compute-1 systemd[211717]: Reached target Basic System.
Feb 16 13:38:32 compute-1 systemd[211717]: Reached target Main User Target.
Feb 16 13:38:32 compute-1 systemd[211717]: Startup finished in 138ms.
Feb 16 13:38:32 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:38:32 compute-1 systemd[1]: Started Session 38 of User nova.
Feb 16 13:38:32 compute-1 sshd-session[211713]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:38:32 compute-1 sshd-session[211732]: Received disconnect from 192.168.122.100 port 36472:11: disconnected by user
Feb 16 13:38:32 compute-1 sshd-session[211732]: Disconnected from user nova 192.168.122.100 port 36472
Feb 16 13:38:32 compute-1 sshd-session[211713]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:38:32 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Feb 16 13:38:32 compute-1 systemd-logind[821]: Session 38 logged out. Waiting for processes to exit.
Feb 16 13:38:32 compute-1 systemd-logind[821]: Removed session 38.
Feb 16 13:38:33 compute-1 nova_compute[185910]: 2026-02-16 13:38:33.645 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:38:33 compute-1 nova_compute[185910]: 2026-02-16 13:38:33.646 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:38:33 compute-1 nova_compute[185910]: 2026-02-16 13:38:33.870 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.459 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.626 185914 DEBUG nova.compute.manager [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.626 185914 DEBUG oslo_concurrency.lockutils [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.627 185914 DEBUG oslo_concurrency.lockutils [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.627 185914 DEBUG oslo_concurrency.lockutils [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.627 185914 DEBUG nova.compute.manager [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:34 compute-1 nova_compute[185910]: 2026-02-16 13:38:34.627 185914 DEBUG nova.compute.manager [req-a7993694-2e0f-4c26-831d-4bf3f31d5c0d req-1ceba13f-7c14-4231-afe3-ca05d3cd8a08 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:38:34 compute-1 podman[211736]: 2026-02-16 13:38:34.930424967 +0000 UTC m=+0.061050306 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:38:35 compute-1 sshd-session[211734]: Invalid user firedancer from 2.57.122.210 port 60738
Feb 16 13:38:35 compute-1 sshd-session[211734]: Connection closed by invalid user firedancer 2.57.122.210 port 60738 [preauth]
Feb 16 13:38:35 compute-1 podman[195236]: time="2026-02-16T13:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:38:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:38:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2638 "" "Go-http-client/1.1"
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.732 185914 DEBUG nova.compute.manager [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.732 185914 DEBUG oslo_concurrency.lockutils [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.733 185914 DEBUG oslo_concurrency.lockutils [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.733 185914 DEBUG oslo_concurrency.lockutils [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.733 185914 DEBUG nova.compute.manager [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:36 compute-1 nova_compute[185910]: 2026-02-16 13:38:36.733 185914 WARNING nova.compute.manager [req-98255d42-1851-4dee-91c0-1dd647773630 req-e8a250f6-6f2f-41ed-8616-26bdfcf504c0 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.478 185914 INFO nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Took 7.11 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.478 185914 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.498 185914 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4juvj7yy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3c7e1337-03a5-4860-9bdf-2ff0df92ca75',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(025fcc49-f70e-4e9f-b721-10432ce23ad0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.522 185914 DEBUG nova.objects.instance [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.523 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.524 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.524 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.542 185914 DEBUG nova.virt.libvirt.vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:37:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:37:42Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.543 185914 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.544 185914 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.545 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:38:37 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:7c:9b:d1"/>
Feb 16 13:38:37 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:38:37 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:38:37 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:38:37 compute-1 nova_compute[185910]:   <target dev="tap6eb3ffb6-7a"/>
Feb 16 13:38:37 compute-1 nova_compute[185910]: </interface>
Feb 16 13:38:37 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:38:37 compute-1 nova_compute[185910]: 2026-02-16 13:38:37.545 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:38:37 compute-1 sshd-session[211761]: Invalid user postgres from 188.166.42.159 port 42470
Feb 16 13:38:37 compute-1 sshd-session[211761]: Connection closed by invalid user postgres 188.166.42.159 port 42470 [preauth]
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.026 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.027 185914 INFO nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.106 185914 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.608 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.609 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.851 185914 DEBUG nova.compute.manager [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-changed-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.851 185914 DEBUG nova.compute.manager [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Refreshing instance network info cache due to event network-changed-6eb3ffb6-7a82-44c5-98d8-1fa609426d92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.852 185914 DEBUG oslo_concurrency.lockutils [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.853 185914 DEBUG oslo_concurrency.lockutils [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.854 185914 DEBUG nova.network.neutron [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Refreshing network info cache for port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:38:38 compute-1 nova_compute[185910]: 2026-02-16 13:38:38.873 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:39 compute-1 nova_compute[185910]: 2026-02-16 13:38:39.113 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:39 compute-1 nova_compute[185910]: 2026-02-16 13:38:39.114 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:39 compute-1 nova_compute[185910]: 2026-02-16 13:38:39.464 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:39 compute-1 nova_compute[185910]: 2026-02-16 13:38:39.618 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:39 compute-1 nova_compute[185910]: 2026-02-16 13:38:39.619 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:40 compute-1 nova_compute[185910]: 2026-02-16 13:38:40.124 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:40 compute-1 nova_compute[185910]: 2026-02-16 13:38:40.125 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:40 compute-1 nova_compute[185910]: 2026-02-16 13:38:40.628 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:40 compute-1 nova_compute[185910]: 2026-02-16 13:38:40.628 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.132 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.132 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.436 185914 DEBUG nova.network.neutron [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updated VIF entry in instance network info cache for port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.437 185914 DEBUG nova.network.neutron [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Updating instance_info_cache with network_info: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.479 185914 DEBUG oslo_concurrency.lockutils [req-f48cbcb6-5d47-4ea0-b633-4ab8919ef273 req-16003d00-bb84-4ece-864a-3eedf929dba1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3c7e1337-03a5-4860-9bdf-2ff0df92ca75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.637 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:41 compute-1 nova_compute[185910]: 2026-02-16 13:38:41.638 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.143 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Current 50 elapsed 4 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.144 185914 DEBUG nova.virt.libvirt.migration [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.227 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249122.2266252, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.228 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Paused (Lifecycle Event)
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.250 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.256 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.286 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:38:42 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:38:42 compute-1 systemd[211717]: Activating special unit Exit the Session...
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped target Main User Target.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped target Basic System.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped target Paths.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped target Sockets.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped target Timers.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:38:42 compute-1 systemd[211717]: Closed D-Bus User Message Bus Socket.
Feb 16 13:38:42 compute-1 systemd[211717]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:38:42 compute-1 systemd[211717]: Removed slice User Application Slice.
Feb 16 13:38:42 compute-1 systemd[211717]: Reached target Shutdown.
Feb 16 13:38:42 compute-1 systemd[211717]: Finished Exit the Session.
Feb 16 13:38:42 compute-1 systemd[211717]: Reached target Exit the Session.
Feb 16 13:38:42 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:38:42 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:38:42 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:38:42 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:38:42 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:38:42 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:38:42 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:38:42 compute-1 kernel: tap6eb3ffb6-7a (unregistering): left promiscuous mode
Feb 16 13:38:42 compute-1 NetworkManager[56388]: <info>  [1771249122.3938] device (tap6eb3ffb6-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.402 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00107|binding|INFO|Releasing lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 from this chassis (sb_readonly=0)
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00108|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 down in Southbound
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00109|binding|INFO|Removing iface tap6eb3ffb6-7a ovn-installed in OVS
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.405 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.414 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.416 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.417 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.419 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6622c3d3-30f8-4a22-a9d6-1ed9edf1f105]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.420 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.458 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 16 13:38:42 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 14.637s CPU time.
Feb 16 13:38:42 compute-1 systemd-machined[155419]: Machine qemu-9-instance-0000000d terminated.
Feb 16 13:38:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [NOTICE]   (211507) : haproxy version is 2.8.14-c23fe91
Feb 16 13:38:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [NOTICE]   (211507) : path to executable is /usr/sbin/haproxy
Feb 16 13:38:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [WARNING]  (211507) : Exiting Master process...
Feb 16 13:38:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [ALERT]    (211507) : Current worker (211509) exited with code 143 (Terminated)
Feb 16 13:38:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[211503]: [WARNING]  (211507) : All workers exited. Exiting... (0)
Feb 16 13:38:42 compute-1 systemd[1]: libpod-6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94.scope: Deactivated successfully.
Feb 16 13:38:42 compute-1 podman[211797]: 2026-02-16 13:38:42.561530918 +0000 UTC m=+0.049687381 container died 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:38:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94-userdata-shm.mount: Deactivated successfully.
Feb 16 13:38:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-f657aa004c8c34b72561fd329d8955f48c44333db96f3fff724d2eb1187f5c65-merged.mount: Deactivated successfully.
Feb 16 13:38:42 compute-1 kernel: tap6eb3ffb6-7a: entered promiscuous mode
Feb 16 13:38:42 compute-1 NetworkManager[56388]: <info>  [1771249122.5913] manager: (tap6eb3ffb6-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Feb 16 13:38:42 compute-1 systemd-udevd[211791]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:38:42 compute-1 kernel: tap6eb3ffb6-7a (unregistering): left promiscuous mode
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.593 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00110|binding|INFO|Claiming lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for this chassis.
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00111|binding|INFO|6eb3ffb6-7a82-44c5-98d8-1fa609426d92: Claiming fa:16:3e:7c:9b:d1 10.100.0.11
Feb 16 13:38:42 compute-1 podman[211797]: 2026-02-16 13:38:42.599944817 +0000 UTC m=+0.088101280 container cleanup 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00112|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 ovn-installed in OVS
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00113|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 up in Southbound
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.608 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00114|binding|INFO|Releasing lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 from this chassis (sb_readonly=1)
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00115|if_status|INFO|Not setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 down as sb is readonly
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00116|binding|INFO|Removing iface tap6eb3ffb6-7a ovn-installed in OVS
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.610 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.611 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 systemd[1]: libpod-conmon-6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94.scope: Deactivated successfully.
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.614 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00117|binding|INFO|Releasing lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 from this chassis (sb_readonly=0)
Feb 16 13:38:42 compute-1 ovn_controller[96285]: 2026-02-16T13:38:42Z|00118|binding|INFO|Setting lport 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 down in Southbound
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.631 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:9b:d1 10.100.0.11'], port_security=['fa:16:3e:7c:9b:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3c7e1337-03a5-4860-9bdf-2ff0df92ca75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6eb3ffb6-7a82-44c5-98d8-1fa609426d92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.643 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.644 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.644 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.646 185914 DEBUG nova.virt.libvirt.guest [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '3c7e1337-03a5-4860-9bdf-2ff0df92ca75' (instance-0000000d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.646 185914 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migration operation has completed
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.647 185914 INFO nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] _post_live_migration() is started..
Feb 16 13:38:42 compute-1 podman[211830]: 2026-02-16 13:38:42.675705355 +0000 UTC m=+0.047935344 container remove 6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.681 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1d78ddf3-64ce-4f34-bd17-83af10400a19]: (4, ('Mon Feb 16 01:38:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94)\n6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94\nMon Feb 16 01:38:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94)\n6cb610a2745d52778ecb74b98036f6de25c36bb92e5c5d6d345d497e56f7bd94\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.683 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[929faa44-42f0-4b36-802b-bf87f261d8d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.684 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.686 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.693 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.698 185914 DEBUG nova.compute.manager [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.699 185914 DEBUG oslo_concurrency.lockutils [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.699 185914 DEBUG oslo_concurrency.lockutils [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.699 185914 DEBUG oslo_concurrency.lockutils [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.700 185914 DEBUG nova.compute.manager [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.699 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a9670a-4964-43fb-be22-9a7990ad5ac3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 nova_compute[185910]: 2026-02-16 13:38:42.700 185914 DEBUG nova.compute.manager [req-0f5b60c6-2b9d-4a7a-83b7-ee29d490b4e4 req-a1f510be-03ee-450f-94ff-b326ee14eb23 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.717 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c2129a22-1696-4906-8806-0f3a11140053]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.720 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[56a97321-5a01-495e-af79-772359c2cdc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.736 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a316cdbc-0cb2-46b1-8782-afab972bb2b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 514606, 'reachable_time': 31232, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211852, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.740 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.740 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cda8bf-f24e-4cf7-9616-862311376fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.741 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.743 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.743 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0c83f8d7-9f8e-435d-8b41-446829455762]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.744 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.745 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:38:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:42.745 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[16ff4983-6b11-47c2-8b5d-56a898cd4d24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:38:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:43.599 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:38:43 compute-1 nova_compute[185910]: 2026-02-16 13:38:43.599 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:43.601 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:38:43 compute-1 nova_compute[185910]: 2026-02-16 13:38:43.874 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.013 185914 DEBUG nova.compute.manager [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.013 185914 DEBUG oslo_concurrency.lockutils [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.014 185914 DEBUG oslo_concurrency.lockutils [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.014 185914 DEBUG oslo_concurrency.lockutils [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.014 185914 DEBUG nova.compute.manager [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.014 185914 DEBUG nova.compute.manager [req-1b5a0915-3ded-4092-ba9a-49e8c05eac7c req-bd26cabe-fa43-469d-a262-eae370582b3a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-unplugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.467 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.861 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.862 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.862 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.862 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.863 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.863 185914 WARNING nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.863 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.863 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.864 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.864 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.864 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.864 185914 WARNING nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.864 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.865 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.865 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.865 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.865 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.866 185914 WARNING nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.866 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.866 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.866 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.867 185914 DEBUG oslo_concurrency.lockutils [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.867 185914 DEBUG nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.867 185914 WARNING nova.compute.manager [req-43c02fa1-d614-4b22-a443-6ae22811c6eb req-4c94909a-c878-4de1-9b29-98a2b6e96280 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.878 185914 DEBUG nova.network.neutron [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 6eb3ffb6-7a82-44c5-98d8-1fa609426d92 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.878 185914 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.879 185914 DEBUG nova.virt.libvirt.vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-931541268',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-931541268',id=13,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:37:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-3jycgte6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:38:27Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3c7e1337-03a5-4860-9bdf-2ff0df92ca75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.880 185914 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "address": "fa:16:3e:7c:9b:d1", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6eb3ffb6-7a", "ovs_interfaceid": "6eb3ffb6-7a82-44c5-98d8-1fa609426d92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.881 185914 DEBUG nova.network.os_vif_util [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.881 185914 DEBUG os_vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.884 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.884 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6eb3ffb6-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.886 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.887 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.891 185914 INFO os_vif [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:9b:d1,bridge_name='br-int',has_traffic_filtering=True,id=6eb3ffb6-7a82-44c5-98d8-1fa609426d92,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6eb3ffb6-7a')
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.892 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.892 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.893 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.893 185914 DEBUG nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.893 185914 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Deleting instance files /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75_del
Feb 16 13:38:44 compute-1 nova_compute[185910]: 2026-02-16 13:38:44.894 185914 INFO nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Deletion of /var/lib/nova/instances/3c7e1337-03a5-4860-9bdf-2ff0df92ca75_del complete
Feb 16 13:38:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:38:46.603 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.997 185914 DEBUG nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.998 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.998 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.999 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.999 185914 DEBUG nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:46 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.999 185914 WARNING nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:46.999 185914 DEBUG nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:47.000 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:47.000 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:47.000 185914 DEBUG oslo_concurrency.lockutils [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:47.001 185914 DEBUG nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] No waiting events found dispatching network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:38:47 compute-1 nova_compute[185910]: 2026-02-16 13:38:47.001 185914 WARNING nova.compute.manager [req-d9126fc6-41fc-40ef-b1b6-f37f293e47b4 req-5689c337-aec7-4344-bbc3-c7e592e5199f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Received unexpected event network-vif-plugged-6eb3ffb6-7a82-44c5-98d8-1fa609426d92 for instance with vm_state active and task_state migrating.
Feb 16 13:38:48 compute-1 nova_compute[185910]: 2026-02-16 13:38:48.875 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:49 compute-1 openstack_network_exporter[198096]: ERROR   13:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:38:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:38:49 compute-1 openstack_network_exporter[198096]: ERROR   13:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:38:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:38:49 compute-1 nova_compute[185910]: 2026-02-16 13:38:49.886 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.527 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.528 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.528 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3c7e1337-03a5-4860-9bdf-2ff0df92ca75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.553 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.553 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.553 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.554 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:38:51 compute-1 podman[211854]: 2026-02-16 13:38:51.66888781 +0000 UTC m=+0.072563643 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Feb 16 13:38:51 compute-1 podman[211855]: 2026-02-16 13:38:51.69390886 +0000 UTC m=+0.097089380 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.732 185914 WARNING nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.734 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5800MB free_disk=73.22365951538086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.734 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.734 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.791 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.818 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.860 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 025fcc49-f70e-4e9f-b721-10432ce23ad0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.861 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.861 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.944 185914 DEBUG nova.compute.provider_tree [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.966 185914 DEBUG nova.scheduler.client.report [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.997 185914 DEBUG nova.compute.resource_tracker [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:38:51 compute-1 nova_compute[185910]: 2026-02-16 13:38:51.997 185914 DEBUG oslo_concurrency.lockutils [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:38:52 compute-1 nova_compute[185910]: 2026-02-16 13:38:52.002 185914 INFO nova.compute.manager [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:38:52 compute-1 nova_compute[185910]: 2026-02-16 13:38:52.106 185914 INFO nova.scheduler.client.report [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 025fcc49-f70e-4e9f-b721-10432ce23ad0
Feb 16 13:38:52 compute-1 nova_compute[185910]: 2026-02-16 13:38:52.107 185914 DEBUG nova.virt.libvirt.driver [None req-3bcd745a-8ea6-43a2-b0f5-f4f1ec42f113 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:38:53 compute-1 nova_compute[185910]: 2026-02-16 13:38:53.877 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:54 compute-1 nova_compute[185910]: 2026-02-16 13:38:54.889 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:57 compute-1 nova_compute[185910]: 2026-02-16 13:38:57.640 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249122.6395795, 3c7e1337-03a5-4860-9bdf-2ff0df92ca75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:38:57 compute-1 nova_compute[185910]: 2026-02-16 13:38:57.641 185914 INFO nova.compute.manager [-] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] VM Stopped (Lifecycle Event)
Feb 16 13:38:57 compute-1 nova_compute[185910]: 2026-02-16 13:38:57.685 185914 DEBUG nova.compute.manager [None req-569c6b57-edd8-4fa8-86ba-5bda44128a4b - - - - - -] [instance: 3c7e1337-03a5-4860-9bdf-2ff0df92ca75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:38:57 compute-1 podman[211891]: 2026-02-16 13:38:57.949255921 +0000 UTC m=+0.091157191 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 13:38:58 compute-1 nova_compute[185910]: 2026-02-16 13:38:58.879 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:38:59 compute-1 nova_compute[185910]: 2026-02-16 13:38:59.892 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:03.346 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:03.347 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:03.347 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:03 compute-1 nova_compute[185910]: 2026-02-16 13:39:03.880 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:04 compute-1 nova_compute[185910]: 2026-02-16 13:39:04.895 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:05 compute-1 podman[195236]: time="2026-02-16T13:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:39:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:39:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 16 13:39:05 compute-1 podman[211918]: 2026-02-16 13:39:05.925843332 +0000 UTC m=+0.062425613 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:39:08 compute-1 nova_compute[185910]: 2026-02-16 13:39:08.881 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:09 compute-1 nova_compute[185910]: 2026-02-16 13:39:09.898 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:13 compute-1 nova_compute[185910]: 2026-02-16 13:39:13.899 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:14 compute-1 nova_compute[185910]: 2026-02-16 13:39:14.648 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:14 compute-1 nova_compute[185910]: 2026-02-16 13:39:14.901 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:15 compute-1 nova_compute[185910]: 2026-02-16 13:39:15.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.667 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.668 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.669 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.815 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.816 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5813MB free_disk=73.22365951538086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.816 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:17 compute-1 nova_compute[185910]: 2026-02-16 13:39:17.816 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.042 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.043 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.081 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.105 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.107 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.108 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:18 compute-1 nova_compute[185910]: 2026-02-16 13:39:18.901 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:19 compute-1 nova_compute[185910]: 2026-02-16 13:39:19.109 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:19 compute-1 nova_compute[185910]: 2026-02-16 13:39:19.109 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:19 compute-1 nova_compute[185910]: 2026-02-16 13:39:19.110 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:19 compute-1 openstack_network_exporter[198096]: ERROR   13:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:39:19 compute-1 openstack_network_exporter[198096]: ERROR   13:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:39:19 compute-1 nova_compute[185910]: 2026-02-16 13:39:19.903 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:20 compute-1 nova_compute[185910]: 2026-02-16 13:39:20.492 185914 DEBUG nova.compute.manager [None req-d44e0603-adc3-46dd-81f2-e79f1ef650fb d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:39:20 compute-1 nova_compute[185910]: 2026-02-16 13:39:20.554 185914 DEBUG nova.compute.provider_tree [None req-d44e0603-adc3-46dd-81f2-e79f1ef650fb d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 21 to 24 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:39:21 compute-1 nova_compute[185910]: 2026-02-16 13:39:21.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:21 compute-1 podman[211942]: 2026-02-16 13:39:21.927254075 +0000 UTC m=+0.067896839 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:39:21 compute-1 podman[211943]: 2026-02-16 13:39:21.934497799 +0000 UTC m=+0.069201854 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:39:23 compute-1 nova_compute[185910]: 2026-02-16 13:39:23.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:23 compute-1 nova_compute[185910]: 2026-02-16 13:39:23.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:39:23 compute-1 nova_compute[185910]: 2026-02-16 13:39:23.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:39:23 compute-1 nova_compute[185910]: 2026-02-16 13:39:23.657 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:39:23 compute-1 nova_compute[185910]: 2026-02-16 13:39:23.946 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:24 compute-1 nova_compute[185910]: 2026-02-16 13:39:24.905 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:25 compute-1 nova_compute[185910]: 2026-02-16 13:39:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:25 compute-1 nova_compute[185910]: 2026-02-16 13:39:25.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:39:26 compute-1 sshd-session[211982]: Invalid user admin from 146.190.226.24 port 44614
Feb 16 13:39:26 compute-1 sshd-session[211982]: Connection closed by invalid user admin 146.190.226.24 port 44614 [preauth]
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.035 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.037 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.061 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.178 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.178 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.185 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.186 185914 INFO nova.compute.claims [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.368 185914 DEBUG nova.compute.provider_tree [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.388 185914 DEBUG nova.scheduler.client.report [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.415 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.416 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.467 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.468 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.489 185914 INFO nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.521 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.620 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.622 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.622 185914 INFO nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Creating image(s)
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.623 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.623 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.625 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.641 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.643 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.698 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.699 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.700 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.728 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.792 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.793 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.820 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.821 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.822 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.876 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.877 185914 DEBUG nova.virt.disk.api [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.878 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.928 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.929 185914 DEBUG nova.virt.disk.api [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.930 185914 DEBUG nova.objects.instance [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 70158849-73fd-43e5-a303-7507eec3bf57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.947 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.948 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Ensure instance console log exists: /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.948 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.948 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:27 compute-1 nova_compute[185910]: 2026-02-16 13:39:27.949 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:28 compute-1 nova_compute[185910]: 2026-02-16 13:39:28.513 185914 DEBUG nova.policy [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:39:28 compute-1 nova_compute[185910]: 2026-02-16 13:39:28.947 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:28 compute-1 podman[211999]: 2026-02-16 13:39:28.958837476 +0000 UTC m=+0.092307792 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:39:29 compute-1 sshd-session[212025]: Invalid user postgres from 188.166.42.159 port 38140
Feb 16 13:39:29 compute-1 nova_compute[185910]: 2026-02-16 13:39:29.457 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Successfully created port: 4d097b85-b92a-43e4-abbe-66dbafacee3c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:39:29 compute-1 sshd-session[212025]: Connection closed by invalid user postgres 188.166.42.159 port 38140 [preauth]
Feb 16 13:39:29 compute-1 nova_compute[185910]: 2026-02-16 13:39:29.908 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.270 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Successfully updated port: 4d097b85-b92a-43e4-abbe-66dbafacee3c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.292 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.293 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.293 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.465 185914 DEBUG nova.compute.manager [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-changed-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.465 185914 DEBUG nova.compute.manager [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Refreshing instance network info cache due to event network-changed-4d097b85-b92a-43e4-abbe-66dbafacee3c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.466 185914 DEBUG oslo_concurrency.lockutils [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:39:31 compute-1 nova_compute[185910]: 2026-02-16 13:39:31.527 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:39:33 compute-1 nova_compute[185910]: 2026-02-16 13:39:33.949 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.010 185914 DEBUG nova.network.neutron [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updating instance_info_cache with network_info: [{"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.039 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.040 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Instance network_info: |[{"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.041 185914 DEBUG oslo_concurrency.lockutils [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.041 185914 DEBUG nova.network.neutron [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Refreshing network info cache for port 4d097b85-b92a-43e4-abbe-66dbafacee3c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.047 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Start _get_guest_xml network_info=[{"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.055 185914 WARNING nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.061 185914 DEBUG nova.virt.libvirt.host [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.062 185914 DEBUG nova.virt.libvirt.host [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.067 185914 DEBUG nova.virt.libvirt.host [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.067 185914 DEBUG nova.virt.libvirt.host [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.069 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.070 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.070 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.071 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.071 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.072 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.072 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.072 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.073 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.073 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.074 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.074 185914 DEBUG nova.virt.hardware [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.080 185914 DEBUG nova.virt.libvirt.vif [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:39:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1319434423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1319434423',id=15,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4paisd94',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:39:27Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=70158849-73fd-43e5-a303-7507eec3bf57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.081 185914 DEBUG nova.network.os_vif_util [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.082 185914 DEBUG nova.network.os_vif_util [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.084 185914 DEBUG nova.objects.instance [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70158849-73fd-43e5-a303-7507eec3bf57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.104 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <uuid>70158849-73fd-43e5-a303-7507eec3bf57</uuid>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <name>instance-0000000f</name>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-1319434423</nova:name>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:39:34</nova:creationTime>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         <nova:port uuid="4d097b85-b92a-43e4-abbe-66dbafacee3c">
Feb 16 13:39:34 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <system>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="serial">70158849-73fd-43e5-a303-7507eec3bf57</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="uuid">70158849-73fd-43e5-a303-7507eec3bf57</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </system>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <os>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </os>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <features>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </features>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.config"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:aa:70:d3"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <target dev="tap4d097b85-b9"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/console.log" append="off"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <video>
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </video>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:39:34 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:39:34 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:39:34 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:39:34 compute-1 nova_compute[185910]: </domain>
Feb 16 13:39:34 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.106 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Preparing to wait for external event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.107 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.107 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.107 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.108 185914 DEBUG nova.virt.libvirt.vif [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:39:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1319434423',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1319434423',id=15,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4paisd94',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:39:27Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=70158849-73fd-43e5-a303-7507eec3bf57,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.109 185914 DEBUG nova.network.os_vif_util [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.110 185914 DEBUG nova.network.os_vif_util [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.110 185914 DEBUG os_vif [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.111 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.111 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.111 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.117 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.117 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d097b85-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.118 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d097b85-b9, col_values=(('external_ids', {'iface-id': '4d097b85-b92a-43e4-abbe-66dbafacee3c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:70:d3', 'vm-uuid': '70158849-73fd-43e5-a303-7507eec3bf57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.120 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 NetworkManager[56388]: <info>  [1771249174.1216] manager: (tap4d097b85-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.123 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.127 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.128 185914 INFO os_vif [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9')
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.198 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.199 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.199 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:aa:70:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.200 185914 INFO nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Using config drive
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.786 185914 INFO nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Creating config drive at /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.config
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.790 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqm8hkve0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.917 185914 DEBUG oslo_concurrency.processutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqm8hkve0" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:39:34 compute-1 kernel: tap4d097b85-b9: entered promiscuous mode
Feb 16 13:39:34 compute-1 NetworkManager[56388]: <info>  [1771249174.9814] manager: (tap4d097b85-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.981 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 ovn_controller[96285]: 2026-02-16T13:39:34Z|00119|binding|INFO|Claiming lport 4d097b85-b92a-43e4-abbe-66dbafacee3c for this chassis.
Feb 16 13:39:34 compute-1 ovn_controller[96285]: 2026-02-16T13:39:34Z|00120|binding|INFO|4d097b85-b92a-43e4-abbe-66dbafacee3c: Claiming fa:16:3e:aa:70:d3 10.100.0.12
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.989 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 ovn_controller[96285]: 2026-02-16T13:39:34Z|00121|binding|INFO|Setting lport 4d097b85-b92a-43e4-abbe-66dbafacee3c ovn-installed in OVS
Feb 16 13:39:34 compute-1 nova_compute[185910]: 2026-02-16 13:39:34.992 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:34 compute-1 ovn_controller[96285]: 2026-02-16T13:39:34Z|00122|binding|INFO|Setting lport 4d097b85-b92a-43e4-abbe-66dbafacee3c up in Southbound
Feb 16 13:39:34 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:34.993 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:70:d3 10.100.0.12'], port_security=['fa:16:3e:aa:70:d3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '70158849-73fd-43e5-a303-7507eec3bf57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4d097b85-b92a-43e4-abbe-66dbafacee3c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:39:34 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:34.994 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4d097b85-b92a-43e4-abbe-66dbafacee3c in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:39:34 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:34.995 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.002 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[00161f8a-c66c-46bd-9c39-a856b9d05eb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.004 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.007 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.008 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9567473f-6d34-4c9a-846a-88b641376b5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.008 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce794cc-7cd7-49cb-81ca-2b824f308552]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 systemd-machined[155419]: New machine qemu-10-instance-0000000f.
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.016 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ce2bd1-0431-40a1-b944-2cfaa8a139b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.027 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[fa828e42-71bc-4c7a-9062-20fb0dbd7995]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000000f.
Feb 16 13:39:35 compute-1 systemd-udevd[212051]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:39:35 compute-1 NetworkManager[56388]: <info>  [1771249175.0548] device (tap4d097b85-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:39:35 compute-1 NetworkManager[56388]: <info>  [1771249175.0562] device (tap4d097b85-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.056 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[293a7eb3-a20c-42c7-9409-ee73197abb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 NetworkManager[56388]: <info>  [1771249175.0646] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Feb 16 13:39:35 compute-1 systemd-udevd[212053]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.064 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c0040d-df79-4345-835f-18a51bd19aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.094 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5ed72d-91e5-4dfc-8894-42d1c586aa24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.098 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[020618c4-e0e4-410a-aecd-c1c35d7e1fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 NetworkManager[56388]: <info>  [1771249175.1147] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.116 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[907b7be2-a577-4b44-8c84-bfe92489a433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.130 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9c7416-1eba-47e6-8eef-4444f36e54b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525953, 'reachable_time': 27520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212079, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.143 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[93c9dc26-9569-4963-accc-b35a4ad15719]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525953, 'tstamp': 525953}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212080, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.155 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[dd8aa9ef-3cfe-4a82-80a1-77feee68ed9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525953, 'reachable_time': 27520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212081, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.174 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1187a6-ac00-48e1-8440-0d59403f9aea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.210 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d471c5c1-a2b8-46e7-905e-dd31cbc1df9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.212 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.213 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.215 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:35 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.220 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:35 compute-1 ovn_controller[96285]: 2026-02-16T13:39:35Z|00123|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:39:35 compute-1 NetworkManager[56388]: <info>  [1771249175.2276] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.229 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.230 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a758b3ee-6ef7-43e3-bee3-0cab169200c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.231 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:39:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:35.231 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.229 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:35 compute-1 podman[212113]: 2026-02-16 13:39:35.540148014 +0000 UTC m=+0.050617926 container create 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:39:35 compute-1 systemd[1]: Started libpod-conmon-9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a.scope.
Feb 16 13:39:35 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:39:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bccf6bf6fbce8d1a6019c94c34a126818264978283546a296450902b29fb5331/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:39:35 compute-1 podman[212113]: 2026-02-16 13:39:35.60012892 +0000 UTC m=+0.110598632 container init 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:39:35 compute-1 podman[212113]: 2026-02-16 13:39:35.604224499 +0000 UTC m=+0.114694211 container start 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:39:35 compute-1 podman[212113]: 2026-02-16 13:39:35.514923139 +0000 UTC m=+0.025392881 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:39:35 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [NOTICE]   (212132) : New worker (212134) forked
Feb 16 13:39:35 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [NOTICE]   (212132) : Loading success.
Feb 16 13:39:35 compute-1 podman[195236]: time="2026-02-16T13:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:39:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:39:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2623 "" "Go-http-client/1.1"
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.671 185914 DEBUG nova.compute.manager [req-846f2106-0a77-4ecf-9fd4-8d3fa2422938 req-792b9322-3541-482c-9bdf-67cb100e1d92 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.672 185914 DEBUG oslo_concurrency.lockutils [req-846f2106-0a77-4ecf-9fd4-8d3fa2422938 req-792b9322-3541-482c-9bdf-67cb100e1d92 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.672 185914 DEBUG oslo_concurrency.lockutils [req-846f2106-0a77-4ecf-9fd4-8d3fa2422938 req-792b9322-3541-482c-9bdf-67cb100e1d92 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.673 185914 DEBUG oslo_concurrency.lockutils [req-846f2106-0a77-4ecf-9fd4-8d3fa2422938 req-792b9322-3541-482c-9bdf-67cb100e1d92 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:35 compute-1 nova_compute[185910]: 2026-02-16 13:39:35.673 185914 DEBUG nova.compute.manager [req-846f2106-0a77-4ecf-9fd4-8d3fa2422938 req-792b9322-3541-482c-9bdf-67cb100e1d92 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Processing event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.144 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.145 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249176.1449895, 70158849-73fd-43e5-a303-7507eec3bf57 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.145 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] VM Started (Lifecycle Event)
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.149 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.152 185914 INFO nova.virt.libvirt.driver [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Instance spawned successfully.
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.152 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.181 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.188 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.191 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.191 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.192 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.192 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.192 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.193 185914 DEBUG nova.virt.libvirt.driver [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.245 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.246 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249176.1460695, 70158849-73fd-43e5-a303-7507eec3bf57 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.246 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] VM Paused (Lifecycle Event)
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.280 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.284 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249176.1479895, 70158849-73fd-43e5-a303-7507eec3bf57 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.284 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] VM Resumed (Lifecycle Event)
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.304 185914 INFO nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Took 8.68 seconds to spawn the instance on the hypervisor.
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.304 185914 DEBUG nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.311 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.314 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.355 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.381 185914 INFO nova.compute.manager [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Took 9.23 seconds to build instance.
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.407 185914 DEBUG oslo_concurrency.lockutils [None req-670d2de4-90ea-4b3e-99d8-7eb5e2e55399 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.511 185914 DEBUG nova.network.neutron [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updated VIF entry in instance network info cache for port 4d097b85-b92a-43e4-abbe-66dbafacee3c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.511 185914 DEBUG nova.network.neutron [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updating instance_info_cache with network_info: [{"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:39:36 compute-1 nova_compute[185910]: 2026-02-16 13:39:36.537 185914 DEBUG oslo_concurrency.lockutils [req-c9260267-171a-4e33-b8e9-ad1ec3f80ae2 req-1e551ec4-4087-4e95-9726-2de205431678 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:39:36 compute-1 podman[212150]: 2026-02-16 13:39:36.919344907 +0000 UTC m=+0.050226436 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.831 185914 DEBUG nova.compute.manager [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.831 185914 DEBUG oslo_concurrency.lockutils [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.831 185914 DEBUG oslo_concurrency.lockutils [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.832 185914 DEBUG oslo_concurrency.lockutils [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.832 185914 DEBUG nova.compute.manager [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] No waiting events found dispatching network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:39:37 compute-1 nova_compute[185910]: 2026-02-16 13:39:37.832 185914 WARNING nova.compute.manager [req-3dbdb9df-e3bb-4e4d-b934-145b8975e7a5 req-4d3f1496-4267-4ec0-a916-790b718a2110 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received unexpected event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c for instance with vm_state active and task_state None.
Feb 16 13:39:38 compute-1 nova_compute[185910]: 2026-02-16 13:39:38.963 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:39 compute-1 nova_compute[185910]: 2026-02-16 13:39:39.121 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:43 compute-1 nova_compute[185910]: 2026-02-16 13:39:43.965 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:44 compute-1 nova_compute[185910]: 2026-02-16 13:39:44.123 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:48 compute-1 nova_compute[185910]: 2026-02-16 13:39:48.968 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:48 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:48.996 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:39:48 compute-1 nova_compute[185910]: 2026-02-16 13:39:48.996 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:48 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:48.997 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:39:49 compute-1 nova_compute[185910]: 2026-02-16 13:39:49.125 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:49 compute-1 ovn_controller[96285]: 2026-02-16T13:39:49Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:aa:70:d3 10.100.0.12
Feb 16 13:39:49 compute-1 ovn_controller[96285]: 2026-02-16T13:39:49Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:aa:70:d3 10.100.0.12
Feb 16 13:39:49 compute-1 openstack_network_exporter[198096]: ERROR   13:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:39:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:39:49 compute-1 openstack_network_exporter[198096]: ERROR   13:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:39:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:39:52 compute-1 podman[212184]: 2026-02-16 13:39:52.916454873 +0000 UTC m=+0.050590586 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:39:52 compute-1 podman[212183]: 2026-02-16 13:39:52.94213012 +0000 UTC m=+0.072634385 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, release=1770267347, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public)
Feb 16 13:39:54 compute-1 nova_compute[185910]: 2026-02-16 13:39:54.019 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:54 compute-1 nova_compute[185910]: 2026-02-16 13:39:54.127 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:56 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:39:56.999 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:39:59 compute-1 nova_compute[185910]: 2026-02-16 13:39:59.021 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:59 compute-1 nova_compute[185910]: 2026-02-16 13:39:59.129 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:39:59 compute-1 podman[212226]: 2026-02-16 13:39:59.919319948 +0000 UTC m=+0.063613044 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_managed=true)
Feb 16 13:40:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:03.348 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:03.349 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:03.350 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:04 compute-1 nova_compute[185910]: 2026-02-16 13:40:04.023 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:04 compute-1 nova_compute[185910]: 2026-02-16 13:40:04.131 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:05 compute-1 podman[195236]: time="2026-02-16T13:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:40:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:40:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2628 "" "Go-http-client/1.1"
Feb 16 13:40:07 compute-1 podman[212253]: 2026-02-16 13:40:07.907851817 +0000 UTC m=+0.051590142 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:40:09 compute-1 nova_compute[185910]: 2026-02-16 13:40:09.025 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:09 compute-1 nova_compute[185910]: 2026-02-16 13:40:09.133 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:14 compute-1 nova_compute[185910]: 2026-02-16 13:40:14.027 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:14 compute-1 nova_compute[185910]: 2026-02-16 13:40:14.135 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:14 compute-1 nova_compute[185910]: 2026-02-16 13:40:14.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:17 compute-1 nova_compute[185910]: 2026-02-16 13:40:17.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.675 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.676 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.767 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.817 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.819 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:18 compute-1 nova_compute[185910]: 2026-02-16 13:40:18.872 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.029 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:19 compute-1 ovn_controller[96285]: 2026-02-16T13:40:19Z|00124|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.058 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.059 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5652MB free_disk=73.19489669799805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.060 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.060 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.138 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.196 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 70158849-73fd-43e5-a303-7507eec3bf57 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.197 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.198 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.275 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.322 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.357 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:40:19 compute-1 nova_compute[185910]: 2026-02-16 13:40:19.357 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:19 compute-1 openstack_network_exporter[198096]: ERROR   13:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:40:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:40:19 compute-1 openstack_network_exporter[198096]: ERROR   13:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:40:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:40:21 compute-1 nova_compute[185910]: 2026-02-16 13:40:21.356 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:21 compute-1 nova_compute[185910]: 2026-02-16 13:40:21.357 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:22 compute-1 sshd-session[212284]: Invalid user postgres from 188.166.42.159 port 57234
Feb 16 13:40:23 compute-1 sshd-session[212284]: Connection closed by invalid user postgres 188.166.42.159 port 57234 [preauth]
Feb 16 13:40:23 compute-1 podman[212286]: 2026-02-16 13:40:23.062485377 +0000 UTC m=+0.079206032 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., architecture=x86_64)
Feb 16 13:40:23 compute-1 podman[212287]: 2026-02-16 13:40:23.080825328 +0000 UTC m=+0.095398685 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.627 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.918 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.919 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.919 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:40:23 compute-1 nova_compute[185910]: 2026-02-16 13:40:23.920 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70158849-73fd-43e5-a303-7507eec3bf57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:24 compute-1 nova_compute[185910]: 2026-02-16 13:40:24.031 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:24 compute-1 nova_compute[185910]: 2026-02-16 13:40:24.140 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:26 compute-1 nova_compute[185910]: 2026-02-16 13:40:26.596 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updating instance_info_cache with network_info: [{"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:26 compute-1 nova_compute[185910]: 2026-02-16 13:40:26.623 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-70158849-73fd-43e5-a303-7507eec3bf57" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:40:26 compute-1 nova_compute[185910]: 2026-02-16 13:40:26.623 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:40:26 compute-1 nova_compute[185910]: 2026-02-16 13:40:26.624 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:40:26 compute-1 nova_compute[185910]: 2026-02-16 13:40:26.624 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:40:27 compute-1 nova_compute[185910]: 2026-02-16 13:40:27.132 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating tmpfile /var/lib/nova/instances/tmp332w3zu3 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:40:27 compute-1 nova_compute[185910]: 2026-02-16 13:40:27.266 185914 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.033 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.142 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.530 185914 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.570 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.571 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:40:29 compute-1 nova_compute[185910]: 2026-02-16 13:40:29.571 185914 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:40:30 compute-1 podman[212328]: 2026-02-16 13:40:30.931856118 +0000 UTC m=+0.070281253 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_managed=true, org.label-schema.license=GPLv2)
Feb 16 13:40:32 compute-1 sshd-session[212355]: Invalid user admin from 146.190.226.24 port 36598
Feb 16 13:40:32 compute-1 sshd-session[212355]: Connection closed by invalid user admin 146.190.226.24 port 36598 [preauth]
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.085 185914 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.115 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.117 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.118 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating instance directory: /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.118 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Creating disk.info with the contents: {'/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk': 'qcow2', '/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.119 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.119 185914 DEBUG nova.objects.instance [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.163 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.241 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.243 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.244 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.263 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.310 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.311 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.349 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.351 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.351 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.395 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.397 185914 DEBUG nova.virt.disk.api [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.398 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.445 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.447 185914 DEBUG nova.virt.disk.api [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.448 185914 DEBUG nova.objects.instance [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.473 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.493 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.496 185914 DEBUG nova.virt.libvirt.volume.remotefs [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config to /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.496 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.937 185914 DEBUG oslo_concurrency.processutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4/disk.config /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.938 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.940 185914 DEBUG nova.virt.libvirt.vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:39:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:39:56Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.940 185914 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.942 185914 DEBUG nova.network.os_vif_util [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.943 185914 DEBUG os_vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.943 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.944 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.945 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.948 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.948 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3dd3d50b-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.949 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3dd3d50b-ad, col_values=(('external_ids', {'iface-id': '3dd3d50b-ad63-4bee-b823-c23750e7afc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:e6:11', 'vm-uuid': '93d211b1-f197-4c96-a994-900df3bf28e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.950 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:33 compute-1 NetworkManager[56388]: <info>  [1771249233.9517] manager: (tap3dd3d50b-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.952 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.957 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.958 185914 INFO os_vif [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad')
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.959 185914 DEBUG nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:40:33 compute-1 nova_compute[185910]: 2026-02-16 13:40:33.959 185914 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:40:34 compute-1 nova_compute[185910]: 2026-02-16 13:40:34.033 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:35 compute-1 podman[195236]: time="2026-02-16T13:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:40:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:40:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 13:40:35 compute-1 nova_compute[185910]: 2026-02-16 13:40:35.995 185914 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:40:35 compute-1 nova_compute[185910]: 2026-02-16 13:40:35.998 185914 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp332w3zu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='93d211b1-f197-4c96-a994-900df3bf28e4',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:40:36 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:40:36 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:40:36 compute-1 NetworkManager[56388]: <info>  [1771249236.3558] manager: (tap3dd3d50b-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Feb 16 13:40:36 compute-1 kernel: tap3dd3d50b-ad: entered promiscuous mode
Feb 16 13:40:36 compute-1 ovn_controller[96285]: 2026-02-16T13:40:36Z|00125|binding|INFO|Claiming lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 for this additional chassis.
Feb 16 13:40:36 compute-1 ovn_controller[96285]: 2026-02-16T13:40:36Z|00126|binding|INFO|3dd3d50b-ad63-4bee-b823-c23750e7afc1: Claiming fa:16:3e:dd:e6:11 10.100.0.8
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.358 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.364 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:36 compute-1 ovn_controller[96285]: 2026-02-16T13:40:36Z|00127|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 ovn-installed in OVS
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.368 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:36 compute-1 systemd-machined[155419]: New machine qemu-11-instance-00000010.
Feb 16 13:40:36 compute-1 systemd[1]: Started Virtual Machine qemu-11-instance-00000010.
Feb 16 13:40:36 compute-1 systemd-udevd[212413]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:40:36 compute-1 NetworkManager[56388]: <info>  [1771249236.4357] device (tap3dd3d50b-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:40:36 compute-1 NetworkManager[56388]: <info>  [1771249236.4368] device (tap3dd3d50b-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.841 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249236.8404803, 93d211b1-f197-4c96-a994-900df3bf28e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.843 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Started (Lifecycle Event)
Feb 16 13:40:36 compute-1 nova_compute[185910]: 2026-02-16 13:40:36.874 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:40:37 compute-1 nova_compute[185910]: 2026-02-16 13:40:37.648 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249237.6476483, 93d211b1-f197-4c96-a994-900df3bf28e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:40:37 compute-1 nova_compute[185910]: 2026-02-16 13:40:37.648 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Resumed (Lifecycle Event)
Feb 16 13:40:37 compute-1 nova_compute[185910]: 2026-02-16 13:40:37.755 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:40:37 compute-1 nova_compute[185910]: 2026-02-16 13:40:37.758 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:40:37 compute-1 nova_compute[185910]: 2026-02-16 13:40:37.792 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Feb 16 13:40:38 compute-1 podman[212442]: 2026-02-16 13:40:38.916921184 +0000 UTC m=+0.054070754 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:40:38 compute-1 nova_compute[185910]: 2026-02-16 13:40:38.951 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-1 nova_compute[185910]: 2026-02-16 13:40:39.035 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-1 ovn_controller[96285]: 2026-02-16T13:40:39Z|00128|binding|INFO|Claiming lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 for this chassis.
Feb 16 13:40:39 compute-1 ovn_controller[96285]: 2026-02-16T13:40:39Z|00129|binding|INFO|3dd3d50b-ad63-4bee-b823-c23750e7afc1: Claiming fa:16:3e:dd:e6:11 10.100.0.8
Feb 16 13:40:39 compute-1 ovn_controller[96285]: 2026-02-16T13:40:39Z|00130|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 up in Southbound
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.515 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:e6:11 10.100.0.8'], port_security=['fa:16:3e:dd:e6:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93d211b1-f197-4c96-a994-900df3bf28e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '11', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=3dd3d50b-ad63-4bee-b823-c23750e7afc1) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.517 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.519 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.530 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ded903-a921-42e9-80d6-c5e5fdf185f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.554 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[48ceb3f6-e2b9-4330-bc96-9e6b1d364da7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.557 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[baacb10a-13cc-4d7c-ab94-93e49088aae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.575 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[aefe95ce-b85e-44c1-b558-8de94bd92d8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.587 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1a8e93d2-0727-4a60-95fd-1cb14b8ced80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525953, 'reachable_time': 36519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212474, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.602 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4d38d7dd-7c3a-4f46-9563-2e093f05e31d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525961, 'tstamp': 525961}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212475, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525962, 'tstamp': 525962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212475, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.603 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:39 compute-1 nova_compute[185910]: 2026-02-16 13:40:39.606 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-1 nova_compute[185910]: 2026-02-16 13:40:39.607 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.608 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.608 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.609 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:39.609 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:40:39 compute-1 nova_compute[185910]: 2026-02-16 13:40:39.763 185914 INFO nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Post operation of migration started
Feb 16 13:40:40 compute-1 nova_compute[185910]: 2026-02-16 13:40:40.495 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:40:40 compute-1 nova_compute[185910]: 2026-02-16 13:40:40.496 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:40:40 compute-1 nova_compute[185910]: 2026-02-16 13:40:40.496 185914 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.215 185914 DEBUG nova.network.neutron [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [{"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.265 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-93d211b1-f197-4c96-a994-900df3bf28e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.291 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.292 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.292 185914 DEBUG oslo_concurrency.lockutils [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:42 compute-1 nova_compute[185910]: 2026-02-16 13:40:42.299 185914 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:40:42 compute-1 virtqemud[185025]: Domain id=11 name='instance-00000010' uuid=93d211b1-f197-4c96-a994-900df3bf28e4 is tainted: custom-monitor
Feb 16 13:40:43 compute-1 nova_compute[185910]: 2026-02-16 13:40:43.308 185914 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:40:43 compute-1 nova_compute[185910]: 2026-02-16 13:40:43.952 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:44 compute-1 nova_compute[185910]: 2026-02-16 13:40:44.037 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:44 compute-1 nova_compute[185910]: 2026-02-16 13:40:44.314 185914 INFO nova.virt.libvirt.driver [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:40:44 compute-1 nova_compute[185910]: 2026-02-16 13:40:44.320 185914 DEBUG nova.compute.manager [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:40:44 compute-1 nova_compute[185910]: 2026-02-16 13:40:44.343 185914 DEBUG nova.objects.instance [None req-04e0a887-f5eb-43c8-b042-ca4d450ef6b8 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:40:48 compute-1 nova_compute[185910]: 2026-02-16 13:40:48.955 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:49 compute-1 nova_compute[185910]: 2026-02-16 13:40:49.039 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:49 compute-1 openstack_network_exporter[198096]: ERROR   13:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:40:49 compute-1 openstack_network_exporter[198096]: ERROR   13:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.814 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.815 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.815 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.815 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.816 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.817 185914 INFO nova.compute.manager [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Terminating instance
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.818 185914 DEBUG nova.compute.manager [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:40:51 compute-1 kernel: tap3dd3d50b-ad (unregistering): left promiscuous mode
Feb 16 13:40:51 compute-1 NetworkManager[56388]: <info>  [1771249251.8465] device (tap3dd3d50b-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:40:51 compute-1 ovn_controller[96285]: 2026-02-16T13:40:51Z|00131|binding|INFO|Releasing lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 from this chassis (sb_readonly=0)
Feb 16 13:40:51 compute-1 ovn_controller[96285]: 2026-02-16T13:40:51Z|00132|binding|INFO|Setting lport 3dd3d50b-ad63-4bee-b823-c23750e7afc1 down in Southbound
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.851 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:51 compute-1 ovn_controller[96285]: 2026-02-16T13:40:51Z|00133|binding|INFO|Removing iface tap3dd3d50b-ad ovn-installed in OVS
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.857 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.872 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:e6:11 10.100.0.8'], port_security=['fa:16:3e:dd:e6:11 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '93d211b1-f197-4c96-a994-900df3bf28e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '13', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=3dd3d50b-ad63-4bee-b823-c23750e7afc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.873 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd3d50b-ad63-4bee-b823-c23750e7afc1 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.874 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:40:51 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Deactivated successfully.
Feb 16 13:40:51 compute-1 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000010.scope: Consumed 1.649s CPU time.
Feb 16 13:40:51 compute-1 systemd-machined[155419]: Machine qemu-11-instance-00000010 terminated.
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.893 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8a16c34f-9baf-48b7-b769-978012c7acb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.922 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[c94a8f2e-48c8-4e29-bc1f-35d659fce6f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.925 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[16f23387-81ea-4d2e-982a-14f0517209d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.953 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[3e884e27-215f-4c2e-a7df-9252f4c713f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.971 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5177e9fe-fd47-41f7-84aa-5e0a4ad0804a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525953, 'reachable_time': 36519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212501, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.990 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc69412-21eb-421f-a298-3278d595948b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525961, 'tstamp': 525961}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212502, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap62a1ccdd-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525962, 'tstamp': 525962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212502, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:51.992 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:51 compute-1 nova_compute[185910]: 2026-02-16 13:40:51.994 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.000 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:52.001 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:52.002 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:40:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:52.002 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:52.002 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.088 185914 INFO nova.virt.libvirt.driver [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Instance destroyed successfully.
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.089 185914 DEBUG nova.objects.instance [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 93d211b1-f197-4c96-a994-900df3bf28e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.106 185914 DEBUG nova.virt.libvirt.vif [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:39:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-804254472',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-804254472',id=16,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:39:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-c0ijme3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:40:44Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=93d211b1-f197-4c96-a994-900df3bf28e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.107 185914 DEBUG nova.network.os_vif_util [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "address": "fa:16:3e:dd:e6:11", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dd3d50b-ad", "ovs_interfaceid": "3dd3d50b-ad63-4bee-b823-c23750e7afc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.108 185914 DEBUG nova.network.os_vif_util [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.108 185914 DEBUG os_vif [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.111 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.112 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3dd3d50b-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.114 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.116 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.120 185914 INFO os_vif [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:e6:11,bridge_name='br-int',has_traffic_filtering=True,id=3dd3d50b-ad63-4bee-b823-c23750e7afc1,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dd3d50b-ad')
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.121 185914 INFO nova.virt.libvirt.driver [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Deleting instance files /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4_del
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.122 185914 INFO nova.virt.libvirt.driver [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Deletion of /var/lib/nova/instances/93d211b1-f197-4c96-a994-900df3bf28e4_del complete
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.209 185914 INFO nova.compute.manager [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.210 185914 DEBUG oslo.service.loopingcall [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.211 185914 DEBUG nova.compute.manager [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.211 185914 DEBUG nova.network.neutron [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.896 185914 DEBUG nova.compute.manager [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.896 185914 DEBUG oslo_concurrency.lockutils [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.897 185914 DEBUG oslo_concurrency.lockutils [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.897 185914 DEBUG oslo_concurrency.lockutils [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.897 185914 DEBUG nova.compute.manager [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:52 compute-1 nova_compute[185910]: 2026-02-16 13:40:52.897 185914 DEBUG nova.compute.manager [req-e134eb1a-f1e6-4a51-ac31-859e337f3221 req-13341309-270b-4680-9a1d-197c9413a91c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-unplugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:40:53 compute-1 podman[212521]: 2026-02-16 13:40:53.906574254 +0000 UTC m=+0.044748424 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, config_id=openstack_network_exporter)
Feb 16 13:40:53 compute-1 podman[212522]: 2026-02-16 13:40:53.931205066 +0000 UTC m=+0.070737202 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.090 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.826 185914 DEBUG nova.network.neutron [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.847 185914 INFO nova.compute.manager [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Took 2.64 seconds to deallocate network for instance.
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.903 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.904 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.910 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:54 compute-1 nova_compute[185910]: 2026-02-16 13:40:54.970 185914 INFO nova.scheduler.client.report [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 93d211b1-f197-4c96-a994-900df3bf28e4
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.037 185914 DEBUG nova.compute.manager [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.038 185914 DEBUG oslo_concurrency.lockutils [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.039 185914 DEBUG oslo_concurrency.lockutils [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.039 185914 DEBUG oslo_concurrency.lockutils [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.040 185914 DEBUG nova.compute.manager [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] No waiting events found dispatching network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.040 185914 WARNING nova.compute.manager [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received unexpected event network-vif-plugged-3dd3d50b-ad63-4bee-b823-c23750e7afc1 for instance with vm_state deleted and task_state None.
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.041 185914 DEBUG nova.compute.manager [req-008f1270-4420-47ec-af07-1a48ec0eba6d req-b427b371-a73b-46bb-99e7-1d585392a7da faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Received event network-vif-deleted-3dd3d50b-ad63-4bee-b823-c23750e7afc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.097 185914 DEBUG oslo_concurrency.lockutils [None req-20c34b37-f716-4bb4-8be7-47c4aaff2728 e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "93d211b1-f197-4c96-a994-900df3bf28e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.399 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.400 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.401 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.401 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.401 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.402 185914 INFO nova.compute.manager [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Terminating instance
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.404 185914 DEBUG nova.compute.manager [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:40:55 compute-1 kernel: tap4d097b85-b9 (unregistering): left promiscuous mode
Feb 16 13:40:55 compute-1 NetworkManager[56388]: <info>  [1771249255.4309] device (tap4d097b85-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 ovn_controller[96285]: 2026-02-16T13:40:55Z|00134|binding|INFO|Releasing lport 4d097b85-b92a-43e4-abbe-66dbafacee3c from this chassis (sb_readonly=0)
Feb 16 13:40:55 compute-1 ovn_controller[96285]: 2026-02-16T13:40:55Z|00135|binding|INFO|Setting lport 4d097b85-b92a-43e4-abbe-66dbafacee3c down in Southbound
Feb 16 13:40:55 compute-1 ovn_controller[96285]: 2026-02-16T13:40:55Z|00136|binding|INFO|Removing iface tap4d097b85-b9 ovn-installed in OVS
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.438 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.450 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:70:d3 10.100.0.12'], port_security=['fa:16:3e:aa:70:d3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '70158849-73fd-43e5-a303-7507eec3bf57', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '4', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4d097b85-b92a-43e4-abbe-66dbafacee3c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.451 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.453 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4d097b85-b92a-43e4-abbe-66dbafacee3c in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.455 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.456 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8244a26a-b9ba-4ffa-a639-eb97b721908a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.458 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:40:55 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Feb 16 13:40:55 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Consumed 16.032s CPU time.
Feb 16 13:40:55 compute-1 systemd-machined[155419]: Machine qemu-10-instance-0000000f terminated.
Feb 16 13:40:55 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [NOTICE]   (212132) : haproxy version is 2.8.14-c23fe91
Feb 16 13:40:55 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [NOTICE]   (212132) : path to executable is /usr/sbin/haproxy
Feb 16 13:40:55 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [WARNING]  (212132) : Exiting Master process...
Feb 16 13:40:55 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [ALERT]    (212132) : Current worker (212134) exited with code 143 (Terminated)
Feb 16 13:40:55 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212128]: [WARNING]  (212132) : All workers exited. Exiting... (0)
Feb 16 13:40:55 compute-1 systemd[1]: libpod-9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a.scope: Deactivated successfully.
Feb 16 13:40:55 compute-1 podman[212582]: 2026-02-16 13:40:55.610573797 +0000 UTC m=+0.048602187 container died 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.624 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a-userdata-shm.mount: Deactivated successfully.
Feb 16 13:40:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-bccf6bf6fbce8d1a6019c94c34a126818264978283546a296450902b29fb5331-merged.mount: Deactivated successfully.
Feb 16 13:40:55 compute-1 podman[212582]: 2026-02-16 13:40:55.648416454 +0000 UTC m=+0.086444864 container cleanup 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.656 185914 INFO nova.virt.libvirt.driver [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Instance destroyed successfully.
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.657 185914 DEBUG nova.objects.instance [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'resources' on Instance uuid 70158849-73fd-43e5-a303-7507eec3bf57 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:40:55 compute-1 systemd[1]: libpod-conmon-9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a.scope: Deactivated successfully.
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.677 185914 DEBUG nova.virt.libvirt.vif [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:39:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1319434423',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1319434423',id=15,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:39:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4paisd94',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:39:36Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=70158849-73fd-43e5-a303-7507eec3bf57,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.677 185914 DEBUG nova.network.os_vif_util [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "address": "fa:16:3e:aa:70:d3", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d097b85-b9", "ovs_interfaceid": "4d097b85-b92a-43e4-abbe-66dbafacee3c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.678 185914 DEBUG nova.network.os_vif_util [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.678 185914 DEBUG os_vif [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.680 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.680 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d097b85-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.681 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.684 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.686 185914 INFO os_vif [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:70:d3,bridge_name='br-int',has_traffic_filtering=True,id=4d097b85-b92a-43e4-abbe-66dbafacee3c,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d097b85-b9')
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.686 185914 INFO nova.virt.libvirt.driver [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Deleting instance files /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57_del
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.687 185914 INFO nova.virt.libvirt.driver [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Deletion of /var/lib/nova/instances/70158849-73fd-43e5-a303-7507eec3bf57_del complete
Feb 16 13:40:55 compute-1 podman[212629]: 2026-02-16 13:40:55.708900229 +0000 UTC m=+0.040719205 container remove 9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.712 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f7eba9-cea3-47a5-abfb-d10ee4d6b548]: (4, ('Mon Feb 16 01:40:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a)\n9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a\nMon Feb 16 01:40:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a)\n9b5b70fab1d78f3ec89b2214cad4a318e93e678abc962781e44d85f550af883a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.714 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0c1f53-1316-436b-a8ed-b7b046c0d971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.715 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.717 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.719 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.721 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[52c6fdf3-255a-447b-843d-8f321a7722c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.726 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.734 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[fe744189-76a9-437f-8226-c970c250fa39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.735 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1be7a6-6b28-4143-9a39-35bf6f88c963]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.745 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b093b574-3978-49a4-9ffc-e48a60fc61de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525947, 'reachable_time': 44781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212644, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.749 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:40:55 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:40:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:55.749 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[529e5d1a-60f6-4daa-9b73-dc6c8db9e201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.764 185914 INFO nova.compute.manager [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.764 185914 DEBUG oslo.service.loopingcall [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.764 185914 DEBUG nova.compute.manager [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.765 185914 DEBUG nova.network.neutron [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.879 185914 DEBUG nova.compute.manager [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-unplugged-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.880 185914 DEBUG oslo_concurrency.lockutils [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.880 185914 DEBUG oslo_concurrency.lockutils [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.880 185914 DEBUG oslo_concurrency.lockutils [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.881 185914 DEBUG nova.compute.manager [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] No waiting events found dispatching network-vif-unplugged-4d097b85-b92a-43e4-abbe-66dbafacee3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:55 compute-1 nova_compute[185910]: 2026-02-16 13:40:55.881 185914 DEBUG nova.compute.manager [req-03313abf-4f84-474d-8fd1-8c4acccdb87f req-ed419992-76a5-4c8a-a338-9269e382c31c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-unplugged-4d097b85-b92a-43e4-abbe-66dbafacee3c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:40:56 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:56.022 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:40:56 compute-1 nova_compute[185910]: 2026-02-16 13:40:56.022 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:40:56 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:40:56.024 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.435 185914 DEBUG nova.network.neutron [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.463 185914 INFO nova.compute.manager [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Took 1.70 seconds to deallocate network for instance.
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.518 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.518 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.587 185914 DEBUG nova.compute.provider_tree [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.610 185914 DEBUG nova.scheduler.client.report [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.635 185914 DEBUG nova.compute.manager [req-730fb98a-587f-4797-96e0-5e5ee25dbed0 req-f4a4f4b4-dd1c-40b6-9058-1468ec3ff89b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-deleted-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.671 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.728 185914 INFO nova.scheduler.client.report [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Deleted allocations for instance 70158849-73fd-43e5-a303-7507eec3bf57
Feb 16 13:40:57 compute-1 nova_compute[185910]: 2026-02-16 13:40:57.915 185914 DEBUG oslo_concurrency.lockutils [None req-c68f619f-44b2-4653-9119-be9e6d780a9b e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.038 185914 DEBUG nova.compute.manager [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.039 185914 DEBUG oslo_concurrency.lockutils [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "70158849-73fd-43e5-a303-7507eec3bf57-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.039 185914 DEBUG oslo_concurrency.lockutils [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.040 185914 DEBUG oslo_concurrency.lockutils [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "70158849-73fd-43e5-a303-7507eec3bf57-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.040 185914 DEBUG nova.compute.manager [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] No waiting events found dispatching network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:40:58 compute-1 nova_compute[185910]: 2026-02-16 13:40:58.041 185914 WARNING nova.compute.manager [req-9844cfd8-7a1d-4492-85f4-9af7df68e622 req-71ad8ae4-3d5d-4a60-afbe-2f13a7d8b4e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Received unexpected event network-vif-plugged-4d097b85-b92a-43e4-abbe-66dbafacee3c for instance with vm_state deleted and task_state None.
Feb 16 13:40:59 compute-1 nova_compute[185910]: 2026-02-16 13:40:59.139 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:00 compute-1 nova_compute[185910]: 2026-02-16 13:41:00.683 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:01 compute-1 sshd-session[212645]: Invalid user raydium from 2.57.122.210 port 35196
Feb 16 13:41:01 compute-1 podman[212647]: 2026-02-16 13:41:01.519577079 +0000 UTC m=+0.101645743 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:41:01 compute-1 sshd-session[212645]: Connection closed by invalid user raydium 2.57.122.210 port 35196 [preauth]
Feb 16 13:41:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:03.349 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:03.349 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:03.350 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:04 compute-1 nova_compute[185910]: 2026-02-16 13:41:04.141 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:05 compute-1 podman[195236]: time="2026-02-16T13:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:41:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:41:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 13:41:05 compute-1 nova_compute[185910]: 2026-02-16 13:41:05.686 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:06 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:06.027 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:07 compute-1 nova_compute[185910]: 2026-02-16 13:41:07.086 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249252.0851336, 93d211b1-f197-4c96-a994-900df3bf28e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:07 compute-1 nova_compute[185910]: 2026-02-16 13:41:07.086 185914 INFO nova.compute.manager [-] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] VM Stopped (Lifecycle Event)
Feb 16 13:41:07 compute-1 nova_compute[185910]: 2026-02-16 13:41:07.114 185914 DEBUG nova.compute.manager [None req-efe7dcf3-b1f1-4d6d-98fc-0245e62d2eda - - - - - -] [instance: 93d211b1-f197-4c96-a994-900df3bf28e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:09 compute-1 nova_compute[185910]: 2026-02-16 13:41:09.143 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:09 compute-1 podman[212674]: 2026-02-16 13:41:09.92463531 +0000 UTC m=+0.066190439 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:41:10 compute-1 nova_compute[185910]: 2026-02-16 13:41:10.655 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249255.6542914, 70158849-73fd-43e5-a303-7507eec3bf57 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:10 compute-1 nova_compute[185910]: 2026-02-16 13:41:10.656 185914 INFO nova.compute.manager [-] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] VM Stopped (Lifecycle Event)
Feb 16 13:41:10 compute-1 nova_compute[185910]: 2026-02-16 13:41:10.680 185914 DEBUG nova.compute.manager [None req-e18e62e8-1c13-4059-9d70-b8b567ccbced - - - - - -] [instance: 70158849-73fd-43e5-a303-7507eec3bf57] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:10 compute-1 nova_compute[185910]: 2026-02-16 13:41:10.691 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:14 compute-1 nova_compute[185910]: 2026-02-16 13:41:14.145 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:15 compute-1 nova_compute[185910]: 2026-02-16 13:41:15.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:15 compute-1 nova_compute[185910]: 2026-02-16 13:41:15.695 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:16 compute-1 sshd-session[212700]: Invalid user postgres from 188.166.42.159 port 41272
Feb 16 13:41:16 compute-1 sshd-session[212700]: Connection closed by invalid user postgres 188.166.42.159 port 41272 [preauth]
Feb 16 13:41:17 compute-1 nova_compute[185910]: 2026-02-16 13:41:17.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:18 compute-1 nova_compute[185910]: 2026-02-16 13:41:18.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:19 compute-1 nova_compute[185910]: 2026-02-16 13:41:19.146 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:19 compute-1 openstack_network_exporter[198096]: ERROR   13:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:41:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:41:19 compute-1 openstack_network_exporter[198096]: ERROR   13:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:41:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.670 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.670 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.698 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.858 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.860 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5823MB free_disk=73.22364044189453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.860 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:20 compute-1 nova_compute[185910]: 2026-02-16 13:41:20.861 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.073 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.073 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.148 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.169 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.170 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.191 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.213 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.252 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.275 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.297 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:41:21 compute-1 nova_compute[185910]: 2026-02-16 13:41:21.297 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:24 compute-1 nova_compute[185910]: 2026-02-16 13:41:24.148 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:24 compute-1 podman[212704]: 2026-02-16 13:41:24.919280264 +0000 UTC m=+0.054144136 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 16 13:41:24 compute-1 podman[212703]: 2026-02-16 13:41:24.936006894 +0000 UTC m=+0.071513613 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64)
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.297 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.298 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.298 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.323 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.652 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:25 compute-1 nova_compute[185910]: 2026-02-16 13:41:25.703 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:27 compute-1 nova_compute[185910]: 2026-02-16 13:41:27.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:27 compute-1 nova_compute[185910]: 2026-02-16 13:41:27.677 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:41:27 compute-1 nova_compute[185910]: 2026-02-16 13:41:27.678 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:41:29 compute-1 nova_compute[185910]: 2026-02-16 13:41:29.151 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:30 compute-1 nova_compute[185910]: 2026-02-16 13:41:30.706 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:31 compute-1 podman[212741]: 2026-02-16 13:41:31.925226015 +0000 UTC m=+0.069932730 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 16 13:41:34 compute-1 nova_compute[185910]: 2026-02-16 13:41:34.154 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:35 compute-1 podman[195236]: time="2026-02-16T13:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:41:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:41:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:41:35 compute-1 nova_compute[185910]: 2026-02-16 13:41:35.709 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:37 compute-1 sshd-session[212769]: Invalid user admin from 146.190.226.24 port 52136
Feb 16 13:41:37 compute-1 sshd-session[212769]: Connection closed by invalid user admin 146.190.226.24 port 52136 [preauth]
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.156 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.198 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.198 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.215 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.313 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.314 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.326 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.326 185914 INFO nova.compute.claims [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.428 185914 DEBUG nova.compute.provider_tree [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.443 185914 DEBUG nova.scheduler.client.report [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.465 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.466 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.510 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.511 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.534 185914 INFO nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.551 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.640 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.641 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.642 185914 INFO nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating image(s)
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.642 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.643 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.643 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.656 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.704 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.705 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.706 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.716 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.773 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.774 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.801 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.802 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.803 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.848 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.849 185914 DEBUG nova.virt.disk.api [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.850 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.894 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.895 185914 DEBUG nova.virt.disk.api [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.895 185914 DEBUG nova.objects.instance [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.911 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.912 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Ensure instance console log exists: /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.912 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.912 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:39 compute-1 nova_compute[185910]: 2026-02-16 13:41:39.913 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:40 compute-1 nova_compute[185910]: 2026-02-16 13:41:40.552 185914 DEBUG nova.policy [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:41:40 compute-1 nova_compute[185910]: 2026-02-16 13:41:40.713 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:40 compute-1 podman[212786]: 2026-02-16 13:41:40.932067547 +0000 UTC m=+0.069857498 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.124 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Successfully created port: c9816814-5dfa-4f80-812c-4fc20a800a47 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.780 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Successfully updated port: c9816814-5dfa-4f80-812c-4fc20a800a47 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.800 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.800 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.801 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.881 185914 DEBUG nova.compute.manager [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-changed-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.882 185914 DEBUG nova.compute.manager [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Refreshing instance network info cache due to event network-changed-c9816814-5dfa-4f80-812c-4fc20a800a47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:41:42 compute-1 nova_compute[185910]: 2026-02-16 13:41:42.882 185914 DEBUG oslo_concurrency.lockutils [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:41:43 compute-1 nova_compute[185910]: 2026-02-16 13:41:43.745 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.158 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.949 185914 DEBUG nova.network.neutron [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.972 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.972 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance network_info: |[{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.972 185914 DEBUG oslo_concurrency.lockutils [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.973 185914 DEBUG nova.network.neutron [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Refreshing network info cache for port c9816814-5dfa-4f80-812c-4fc20a800a47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.975 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Start _get_guest_xml network_info=[{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.979 185914 WARNING nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.984 185914 DEBUG nova.virt.libvirt.host [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.984 185914 DEBUG nova.virt.libvirt.host [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.991 185914 DEBUG nova.virt.libvirt.host [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.992 185914 DEBUG nova.virt.libvirt.host [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.993 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.994 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.994 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.994 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.995 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.995 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.995 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.995 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.995 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.996 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.996 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.996 185914 DEBUG nova.virt.hardware [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.999 185914 DEBUG nova.virt.libvirt.vif [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:41:39Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:41:44 compute-1 nova_compute[185910]: 2026-02-16 13:41:44.999 185914 DEBUG nova.network.os_vif_util [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.000 185914 DEBUG nova.network.os_vif_util [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.001 185914 DEBUG nova.objects.instance [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.017 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <uuid>ed0f983d-6cd6-429c-8af1-0d52a56731d6</uuid>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <name>instance-00000012</name>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-1167094500</nova:name>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:41:44</nova:creationTime>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         <nova:port uuid="c9816814-5dfa-4f80-812c-4fc20a800a47">
Feb 16 13:41:45 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <system>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="serial">ed0f983d-6cd6-429c-8af1-0d52a56731d6</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="uuid">ed0f983d-6cd6-429c-8af1-0d52a56731d6</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </system>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <os>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </os>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <features>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </features>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:b7:0e:aa"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <target dev="tapc9816814-5d"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/console.log" append="off"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <video>
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </video>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:41:45 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:41:45 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:41:45 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:41:45 compute-1 nova_compute[185910]: </domain>
Feb 16 13:41:45 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.018 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Preparing to wait for external event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.018 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.018 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.018 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.019 185914 DEBUG nova.virt.libvirt.vif [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:41:39Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.020 185914 DEBUG nova.network.os_vif_util [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.020 185914 DEBUG nova.network.os_vif_util [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.021 185914 DEBUG os_vif [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.021 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.022 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.022 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.025 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.025 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9816814-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.026 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9816814-5d, col_values=(('external_ids', {'iface-id': 'c9816814-5dfa-4f80-812c-4fc20a800a47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:0e:aa', 'vm-uuid': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.027 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 NetworkManager[56388]: <info>  [1771249305.0288] manager: (tapc9816814-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.030 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.033 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.034 185914 INFO os_vif [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d')
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.080 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.080 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.081 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:b7:0e:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.081 185914 INFO nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Using config drive
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.766 185914 INFO nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Creating config drive at /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.770 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmx68q3rh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.896 185914 DEBUG oslo_concurrency.processutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmx68q3rh" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:41:45 compute-1 kernel: tapc9816814-5d: entered promiscuous mode
Feb 16 13:41:45 compute-1 NetworkManager[56388]: <info>  [1771249305.9496] manager: (tapc9816814-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Feb 16 13:41:45 compute-1 ovn_controller[96285]: 2026-02-16T13:41:45Z|00137|binding|INFO|Claiming lport c9816814-5dfa-4f80-812c-4fc20a800a47 for this chassis.
Feb 16 13:41:45 compute-1 ovn_controller[96285]: 2026-02-16T13:41:45Z|00138|binding|INFO|c9816814-5dfa-4f80-812c-4fc20a800a47: Claiming fa:16:3e:b7:0e:aa 10.100.0.13
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.949 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 ovn_controller[96285]: 2026-02-16T13:41:45Z|00139|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 ovn-installed in OVS
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.957 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 ovn_controller[96285]: 2026-02-16T13:41:45Z|00140|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 up in Southbound
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.958 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:0e:aa 10.100.0.13'], port_security=['fa:16:3e:b7:0e:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=c9816814-5dfa-4f80-812c-4fc20a800a47) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.958 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 nova_compute[185910]: 2026-02-16 13:41:45.959 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.959 105573 INFO neutron.agent.ovn.metadata.agent [-] Port c9816814-5dfa-4f80-812c-4fc20a800a47 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.960 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.969 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f730af52-87ff-4ef0-b2b5-0048dbf77de5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.970 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.972 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.973 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0d69c4-3513-4901-a05d-7d1582b563f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.973 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e50628a5-941f-4713-8773-1db1f3bc76f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:45 compute-1 systemd-machined[155419]: New machine qemu-12-instance-00000012.
Feb 16 13:41:45 compute-1 systemd-udevd[212833]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.982 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[071515c1-d83e-48ab-ae22-d2eec76db911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:45.992 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[22d166f4-0545-4e8e-a832-f72e692d1d25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:45 compute-1 NetworkManager[56388]: <info>  [1771249305.9948] device (tapc9816814-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:41:45 compute-1 NetworkManager[56388]: <info>  [1771249305.9954] device (tapc9816814-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:41:45 compute-1 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.015 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[52682883-1a16-4d36-b601-e90f680b865a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 NetworkManager[56388]: <info>  [1771249306.0212] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.020 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[821e2c4a-437d-429c-a317-ff84af4f65d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.043 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c8652b-ed2f-4c9a-b6af-6475a09adf2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.047 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[8048387d-3553-4345-b6a3-09eb67d8f276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 NetworkManager[56388]: <info>  [1771249306.0690] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.074 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[40267518-33d3-4d6d-a744-0a236669756d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.093 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e9ef8f05-6d4d-455e-aec8-e4343f614992]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539048, 'reachable_time': 37805, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212864, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.109 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f34a5551-ab3d-4e1b-9409-a7551f50cb55]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539048, 'tstamp': 539048}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212865, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.125 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5c385d40-f412-440b-a762-162fde4958f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539048, 'reachable_time': 37805, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212866, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.148 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[aff120cc-de1a-45a7-8101-934ed280b844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.195 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c1cf38f5-25d3-4599-b755-06fcd8aa4642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.197 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.197 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.197 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:46 compute-1 NetworkManager[56388]: <info>  [1771249306.2458] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 16 13:41:46 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.251 185914 DEBUG nova.compute.manager [req-3b8e9e2f-8563-4c6e-a615-614adc542f3c req-872d2fec-a150-48ac-a8b3-e8118b83598d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.251 185914 DEBUG oslo_concurrency.lockutils [req-3b8e9e2f-8563-4c6e-a615-614adc542f3c req-872d2fec-a150-48ac-a8b3-e8118b83598d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.251 185914 DEBUG oslo_concurrency.lockutils [req-3b8e9e2f-8563-4c6e-a615-614adc542f3c req-872d2fec-a150-48ac-a8b3-e8118b83598d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.252 185914 DEBUG oslo_concurrency.lockutils [req-3b8e9e2f-8563-4c6e-a615-614adc542f3c req-872d2fec-a150-48ac-a8b3-e8118b83598d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.252 185914 DEBUG nova.compute.manager [req-3b8e9e2f-8563-4c6e-a615-614adc542f3c req-872d2fec-a150-48ac-a8b3-e8118b83598d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Processing event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.252 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.258 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:41:46 compute-1 ovn_controller[96285]: 2026-02-16T13:41:46Z|00141|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:41:46 compute-1 nova_compute[185910]: 2026-02-16 13:41:46.259 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.267 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.268 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e66337e9-dc9d-4337-8258-a811698b1960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.270 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:41:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:41:46.273 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:41:46 compute-1 podman[212899]: 2026-02-16 13:41:46.630074602 +0000 UTC m=+0.054195158 container create 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 16 13:41:46 compute-1 systemd[1]: Started libpod-conmon-8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c.scope.
Feb 16 13:41:46 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:41:46 compute-1 podman[212899]: 2026-02-16 13:41:46.601982367 +0000 UTC m=+0.026102963 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:41:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3357504af40a718f79764d2078e67f79fb874b2a624e00025dfc97f1c3a2059f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:41:46 compute-1 podman[212899]: 2026-02-16 13:41:46.71037795 +0000 UTC m=+0.134498506 container init 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 16 13:41:46 compute-1 podman[212899]: 2026-02-16 13:41:46.714284945 +0000 UTC m=+0.138405501 container start 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:41:46 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [NOTICE]   (212918) : New worker (212920) forked
Feb 16 13:41:46 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [NOTICE]   (212918) : Loading success.
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.024 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249307.023851, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.025 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Started (Lifecycle Event)
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.028 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.032 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.036 185914 INFO nova.virt.libvirt.driver [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance spawned successfully.
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.037 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.059 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.066 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.068 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.069 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.069 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.069 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.070 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.070 185914 DEBUG nova.virt.libvirt.driver [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.113 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.114 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249307.0281494, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.114 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Paused (Lifecycle Event)
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.148 185914 INFO nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Took 7.51 seconds to spawn the instance on the hypervisor.
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.148 185914 DEBUG nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.150 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.155 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249307.0310152, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.155 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Resumed (Lifecycle Event)
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.185 185914 DEBUG nova.network.neutron [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updated VIF entry in instance network info cache for port c9816814-5dfa-4f80-812c-4fc20a800a47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.186 185914 DEBUG nova.network.neutron [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.196 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.198 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.227 185914 DEBUG oslo_concurrency.lockutils [req-174fa51b-aaf6-45e4-9a86-d57888873904 req-b95019fa-a716-4827-bf11-af76746b54ba faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.229 185914 INFO nova.compute.manager [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Took 7.95 seconds to build instance.
Feb 16 13:41:47 compute-1 nova_compute[185910]: 2026-02-16 13:41:47.243 185914 DEBUG oslo_concurrency.lockutils [None req-26e006b1-ef68-4393-9abd-a2976ba8f94e e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.392 185914 DEBUG nova.compute.manager [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.393 185914 DEBUG oslo_concurrency.lockutils [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.393 185914 DEBUG oslo_concurrency.lockutils [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.394 185914 DEBUG oslo_concurrency.lockutils [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.394 185914 DEBUG nova.compute.manager [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:41:48 compute-1 nova_compute[185910]: 2026-02-16 13:41:48.394 185914 WARNING nova.compute.manager [req-9aa4f742-8866-46e7-a520-23e96110f285 req-68e51210-be72-422f-a404-e93c0a4aa0e2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state None.
Feb 16 13:41:49 compute-1 nova_compute[185910]: 2026-02-16 13:41:49.160 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:49 compute-1 openstack_network_exporter[198096]: ERROR   13:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:41:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:41:49 compute-1 openstack_network_exporter[198096]: ERROR   13:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:41:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:41:50 compute-1 nova_compute[185910]: 2026-02-16 13:41:50.028 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:54 compute-1 nova_compute[185910]: 2026-02-16 13:41:54.202 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:55 compute-1 nova_compute[185910]: 2026-02-16 13:41:55.030 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:55 compute-1 podman[212937]: 2026-02-16 13:41:55.960598562 +0000 UTC m=+0.071939664 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 16 13:41:55 compute-1 podman[212936]: 2026-02-16 13:41:55.960022757 +0000 UTC m=+0.078038748 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Feb 16 13:41:59 compute-1 nova_compute[185910]: 2026-02-16 13:41:59.204 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:41:59 compute-1 ovn_controller[96285]: 2026-02-16T13:41:59Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:0e:aa 10.100.0.13
Feb 16 13:41:59 compute-1 ovn_controller[96285]: 2026-02-16T13:41:59Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:0e:aa 10.100.0.13
Feb 16 13:42:00 compute-1 nova_compute[185910]: 2026-02-16 13:42:00.075 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:02 compute-1 podman[212987]: 2026-02-16 13:42:02.957202062 +0000 UTC m=+0.096051532 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:42:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:03.351 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:03.351 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:03.352 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:04 compute-1 nova_compute[185910]: 2026-02-16 13:42:04.207 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:05 compute-1 nova_compute[185910]: 2026-02-16 13:42:05.078 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:05 compute-1 podman[195236]: time="2026-02-16T13:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:42:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:42:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Feb 16 13:42:08 compute-1 sshd-session[213014]: Invalid user postgres from 188.166.42.159 port 51064
Feb 16 13:42:08 compute-1 sshd-session[213014]: Connection closed by invalid user postgres 188.166.42.159 port 51064 [preauth]
Feb 16 13:42:09 compute-1 nova_compute[185910]: 2026-02-16 13:42:09.239 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:10 compute-1 nova_compute[185910]: 2026-02-16 13:42:10.081 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:11 compute-1 podman[213016]: 2026-02-16 13:42:11.914814292 +0000 UTC m=+0.052174383 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:42:14 compute-1 nova_compute[185910]: 2026-02-16 13:42:14.242 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:15 compute-1 nova_compute[185910]: 2026-02-16 13:42:15.124 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:15 compute-1 ovn_controller[96285]: 2026-02-16T13:42:15Z|00142|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 16 13:42:17 compute-1 nova_compute[185910]: 2026-02-16 13:42:17.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:17 compute-1 nova_compute[185910]: 2026-02-16 13:42:17.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:18 compute-1 nova_compute[185910]: 2026-02-16 13:42:18.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:19 compute-1 nova_compute[185910]: 2026-02-16 13:42:19.243 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:19 compute-1 openstack_network_exporter[198096]: ERROR   13:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:42:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:42:19 compute-1 openstack_network_exporter[198096]: ERROR   13:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:42:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:42:20 compute-1 nova_compute[185910]: 2026-02-16 13:42:20.186 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.667 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.668 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.668 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.668 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.762 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.819 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.821 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:21 compute-1 nova_compute[185910]: 2026-02-16 13:42:21.875 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.030 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.031 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5623MB free_disk=73.19474411010742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.031 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.032 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.108 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance ed0f983d-6cd6-429c-8af1-0d52a56731d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.109 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.109 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.188 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.203 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.224 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:42:22 compute-1 nova_compute[185910]: 2026-02-16 13:42:22.224 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:23 compute-1 nova_compute[185910]: 2026-02-16 13:42:23.225 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:24 compute-1 nova_compute[185910]: 2026-02-16 13:42:24.246 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:25 compute-1 nova_compute[185910]: 2026-02-16 13:42:25.224 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:26 compute-1 nova_compute[185910]: 2026-02-16 13:42:26.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:26 compute-1 nova_compute[185910]: 2026-02-16 13:42:26.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:26 compute-1 nova_compute[185910]: 2026-02-16 13:42:26.630 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:42:26 compute-1 nova_compute[185910]: 2026-02-16 13:42:26.630 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:42:26 compute-1 podman[213049]: 2026-02-16 13:42:26.922172316 +0000 UTC m=+0.059360096 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:42:26 compute-1 podman[213048]: 2026-02-16 13:42:26.924380486 +0000 UTC m=+0.061503834 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=)
Feb 16 13:42:27 compute-1 nova_compute[185910]: 2026-02-16 13:42:27.394 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:42:27 compute-1 nova_compute[185910]: 2026-02-16 13:42:27.395 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:42:27 compute-1 nova_compute[185910]: 2026-02-16 13:42:27.395 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:42:27 compute-1 nova_compute[185910]: 2026-02-16 13:42:27.395 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.247 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.536 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.566 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.567 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:42:29 compute-1 nova_compute[185910]: 2026-02-16 13:42:29.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:42:30 compute-1 nova_compute[185910]: 2026-02-16 13:42:30.241 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:33 compute-1 nova_compute[185910]: 2026-02-16 13:42:33.076 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Check if temp file /var/lib/nova/instances/tmpnq0a57mh exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:42:33 compute-1 nova_compute[185910]: 2026-02-16 13:42:33.077 185914 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:42:33 compute-1 podman[213090]: 2026-02-16 13:42:33.925798678 +0000 UTC m=+0.068114840 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 13:42:33 compute-1 nova_compute[185910]: 2026-02-16 13:42:33.974 185914 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:34 compute-1 nova_compute[185910]: 2026-02-16 13:42:34.022 185914 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:34 compute-1 nova_compute[185910]: 2026-02-16 13:42:34.024 185914 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:42:34 compute-1 nova_compute[185910]: 2026-02-16 13:42:34.071 185914 DEBUG oslo_concurrency.processutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:42:34 compute-1 nova_compute[185910]: 2026-02-16 13:42:34.250 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:35 compute-1 nova_compute[185910]: 2026-02-16 13:42:35.299 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:35 compute-1 podman[195236]: time="2026-02-16T13:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:42:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:42:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 16 13:42:36 compute-1 sshd-session[213122]: Accepted publickey for nova from 192.168.122.100 port 45560 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:42:36 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:42:36 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:42:36 compute-1 systemd-logind[821]: New session 40 of user nova.
Feb 16 13:42:36 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:42:36 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:42:36 compute-1 systemd[213126]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:42:36 compute-1 systemd[213126]: Queued start job for default target Main User Target.
Feb 16 13:42:36 compute-1 systemd[213126]: Created slice User Application Slice.
Feb 16 13:42:36 compute-1 systemd[213126]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:42:36 compute-1 systemd[213126]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:42:36 compute-1 systemd[213126]: Reached target Paths.
Feb 16 13:42:36 compute-1 systemd[213126]: Reached target Timers.
Feb 16 13:42:36 compute-1 systemd[213126]: Starting D-Bus User Message Bus Socket...
Feb 16 13:42:36 compute-1 systemd[213126]: Starting Create User's Volatile Files and Directories...
Feb 16 13:42:36 compute-1 systemd[213126]: Finished Create User's Volatile Files and Directories.
Feb 16 13:42:36 compute-1 systemd[213126]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:42:36 compute-1 systemd[213126]: Reached target Sockets.
Feb 16 13:42:36 compute-1 systemd[213126]: Reached target Basic System.
Feb 16 13:42:36 compute-1 systemd[213126]: Reached target Main User Target.
Feb 16 13:42:36 compute-1 systemd[213126]: Startup finished in 139ms.
Feb 16 13:42:36 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:42:36 compute-1 systemd[1]: Started Session 40 of User nova.
Feb 16 13:42:36 compute-1 sshd-session[213122]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:42:36 compute-1 sshd-session[213141]: Received disconnect from 192.168.122.100 port 45560:11: disconnected by user
Feb 16 13:42:36 compute-1 sshd-session[213141]: Disconnected from user nova 192.168.122.100 port 45560
Feb 16 13:42:36 compute-1 sshd-session[213122]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:42:36 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Feb 16 13:42:36 compute-1 systemd-logind[821]: Session 40 logged out. Waiting for processes to exit.
Feb 16 13:42:36 compute-1 systemd-logind[821]: Removed session 40.
Feb 16 13:42:38 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.252 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.463 185914 DEBUG nova.compute.manager [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.463 185914 DEBUG oslo_concurrency.lockutils [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.464 185914 DEBUG oslo_concurrency.lockutils [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.464 185914 DEBUG oslo_concurrency.lockutils [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.464 185914 DEBUG nova.compute.manager [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.465 185914 DEBUG nova.compute.manager [req-25f17afa-52d0-4479-bc3c-76082fd674c7 req-623332b3-df8b-43af-b129-416dfedc98b4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:42:39 compute-1 nova_compute[185910]: 2026-02-16 13:42:39.466 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:39.466 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:42:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:39.467 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:42:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:39.468 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:39 compute-1 sshd-session[213144]: Invalid user test from 146.190.226.24 port 59848
Feb 16 13:42:39 compute-1 sshd-session[213144]: Connection closed by invalid user test 146.190.226.24 port 59848 [preauth]
Feb 16 13:42:40 compute-1 nova_compute[185910]: 2026-02-16 13:42:40.302 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.711 185914 DEBUG nova.compute.manager [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.712 185914 DEBUG oslo_concurrency.lockutils [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.712 185914 DEBUG oslo_concurrency.lockutils [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.712 185914 DEBUG oslo_concurrency.lockutils [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.713 185914 DEBUG nova.compute.manager [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:41 compute-1 nova_compute[185910]: 2026-02-16 13:42:41.713 185914 WARNING nova.compute.manager [req-708580a1-f984-4fc5-a647-5e78b47e8209 req-667dec37-be74-4b30-b141-89bd5add4480 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state migrating.
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.746 185914 INFO nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Took 8.67 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.747 185914 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.772 185914 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnq0a57mh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ed0f983d-6cd6-429c-8af1-0d52a56731d6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(7685ce54-87f0-4b62-961c-75f6175471f1),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.802 185914 DEBUG nova.objects.instance [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid ed0f983d-6cd6-429c-8af1-0d52a56731d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.803 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.806 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.806 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.826 185914 DEBUG nova.virt.libvirt.vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:41:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:41:47Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.827 185914 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.827 185914 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.828 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:42:42 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:b7:0e:aa"/>
Feb 16 13:42:42 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:42:42 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:42:42 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:42:42 compute-1 nova_compute[185910]:   <target dev="tapc9816814-5d"/>
Feb 16 13:42:42 compute-1 nova_compute[185910]: </interface>
Feb 16 13:42:42 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:42:42 compute-1 nova_compute[185910]: 2026-02-16 13:42:42.828 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:42:42 compute-1 podman[213146]: 2026-02-16 13:42:42.959228124 +0000 UTC m=+0.097634424 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.310 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.311 185914 INFO nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.435 185914 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.921 185914 DEBUG nova.compute.manager [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-changed-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.922 185914 DEBUG nova.compute.manager [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Refreshing instance network info cache due to event network-changed-c9816814-5dfa-4f80-812c-4fc20a800a47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.922 185914 DEBUG oslo_concurrency.lockutils [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.923 185914 DEBUG oslo_concurrency.lockutils [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.923 185914 DEBUG nova.network.neutron [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Refreshing network info cache for port c9816814-5dfa-4f80-812c-4fc20a800a47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.940 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:42:43 compute-1 nova_compute[185910]: 2026-02-16 13:42:43.941 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.254 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.445 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.446 185914 DEBUG nova.virt.libvirt.migration [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.486 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249364.48586, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.487 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Paused (Lifecycle Event)
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.512 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.519 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.555 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:42:44 compute-1 kernel: tapc9816814-5d (unregistering): left promiscuous mode
Feb 16 13:42:44 compute-1 NetworkManager[56388]: <info>  [1771249364.6124] device (tapc9816814-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.620 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 ovn_controller[96285]: 2026-02-16T13:42:44Z|00143|binding|INFO|Releasing lport c9816814-5dfa-4f80-812c-4fc20a800a47 from this chassis (sb_readonly=0)
Feb 16 13:42:44 compute-1 ovn_controller[96285]: 2026-02-16T13:42:44Z|00144|binding|INFO|Setting lport c9816814-5dfa-4f80-812c-4fc20a800a47 down in Southbound
Feb 16 13:42:44 compute-1 ovn_controller[96285]: 2026-02-16T13:42:44Z|00145|binding|INFO|Removing iface tapc9816814-5d ovn-installed in OVS
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.625 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.636 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.638 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:0e:aa 10.100.0.13'], port_security=['fa:16:3e:b7:0e:aa 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed0f983d-6cd6-429c-8af1-0d52a56731d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=c9816814-5dfa-4f80-812c-4fc20a800a47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.640 105573 INFO neutron.agent.ovn.metadata.agent [-] Port c9816814-5dfa-4f80-812c-4fc20a800a47 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.641 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.643 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[445153d0-8cf4-42a8-89d8-5770261f2fed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.644 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:42:44 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Feb 16 13:42:44 compute-1 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 15.382s CPU time.
Feb 16 13:42:44 compute-1 systemd-machined[155419]: Machine qemu-12-instance-00000012 terminated.
Feb 16 13:42:44 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [NOTICE]   (212918) : haproxy version is 2.8.14-c23fe91
Feb 16 13:42:44 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [NOTICE]   (212918) : path to executable is /usr/sbin/haproxy
Feb 16 13:42:44 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [WARNING]  (212918) : Exiting Master process...
Feb 16 13:42:44 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [ALERT]    (212918) : Current worker (212920) exited with code 143 (Terminated)
Feb 16 13:42:44 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[212914]: [WARNING]  (212918) : All workers exited. Exiting... (0)
Feb 16 13:42:44 compute-1 systemd[1]: libpod-8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c.scope: Deactivated successfully.
Feb 16 13:42:44 compute-1 podman[213212]: 2026-02-16 13:42:44.765612877 +0000 UTC m=+0.047866278 container died 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:42:44 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c-userdata-shm.mount: Deactivated successfully.
Feb 16 13:42:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-3357504af40a718f79764d2078e67f79fb874b2a624e00025dfc97f1c3a2059f-merged.mount: Deactivated successfully.
Feb 16 13:42:44 compute-1 podman[213212]: 2026-02-16 13:42:44.799138798 +0000 UTC m=+0.081392189 container cleanup 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:42:44 compute-1 systemd[1]: libpod-conmon-8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c.scope: Deactivated successfully.
Feb 16 13:42:44 compute-1 podman[213247]: 2026-02-16 13:42:44.856186121 +0000 UTC m=+0.040702425 container remove 8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.863 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[84ad17d0-a377-4f99-b6be-3d833d39829b]: (4, ('Mon Feb 16 01:42:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c)\n8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c\nMon Feb 16 01:42:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c)\n8a33e8227bbc8d34e972df32b3c25773972511901ded22669815768ced6eb44c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.866 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4b29159e-cb29-45ac-a425-9ca8df4a2c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.867 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.869 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.875 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.876 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.878 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[253984b2-a794-4e5f-9013-316d19e717af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.892 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3d7a0f-3d79-46e5-854a-df21ec7e9c18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.894 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[cef580da-dec3-4bbc-acb0-c228b1a2eca0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.905 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5d577d-24fe-4498-b036-3ff7880410f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539043, 'reachable_time': 28790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213276, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.908 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:42:44 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:42:44 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:42:44.908 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[329bba3e-a3df-4de5-acc3-9c7e4d3e3ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.941 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.942 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.943 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.948 185914 DEBUG nova.virt.libvirt.guest [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'ed0f983d-6cd6-429c-8af1-0d52a56731d6' (instance-00000012) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.948 185914 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migration operation has completed
Feb 16 13:42:44 compute-1 nova_compute[185910]: 2026-02-16 13:42:44.948 185914 INFO nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] _post_live_migration() is started..
Feb 16 13:42:45 compute-1 nova_compute[185910]: 2026-02-16 13:42:45.304 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:46 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:42:46 compute-1 systemd[213126]: Activating special unit Exit the Session...
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped target Main User Target.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped target Basic System.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped target Paths.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped target Sockets.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped target Timers.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:42:46 compute-1 systemd[213126]: Closed D-Bus User Message Bus Socket.
Feb 16 13:42:46 compute-1 systemd[213126]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:42:46 compute-1 systemd[213126]: Removed slice User Application Slice.
Feb 16 13:42:46 compute-1 systemd[213126]: Reached target Shutdown.
Feb 16 13:42:46 compute-1 systemd[213126]: Finished Exit the Session.
Feb 16 13:42:46 compute-1 systemd[213126]: Reached target Exit the Session.
Feb 16 13:42:46 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:42:46 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:42:46 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:42:46 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:42:46 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:42:46 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:42:46 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.354 185914 DEBUG nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.355 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.355 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.356 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.356 185914 DEBUG nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.357 185914 DEBUG nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.357 185914 DEBUG nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.357 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.358 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.358 185914 DEBUG oslo_concurrency.lockutils [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.358 185914 DEBUG nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.359 185914 WARNING nova.compute.manager [req-3d1a2989-ec1a-4c07-b5f9-61ef6db59fb9 req-b03330f7-0158-4754-883b-d6b76550df74 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state migrating.
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.449 185914 DEBUG nova.network.neutron [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updated VIF entry in instance network info cache for port c9816814-5dfa-4f80-812c-4fc20a800a47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.450 185914 DEBUG nova.network.neutron [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Updating instance_info_cache with network_info: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.487 185914 DEBUG oslo_concurrency.lockutils [req-b483c363-a36f-4faf-bca9-8823cc0b5b8b req-a570dce7-cedb-42cd-8d89-a83a52969ef1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-ed0f983d-6cd6-429c-8af1-0d52a56731d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.755 185914 DEBUG nova.network.neutron [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port c9816814-5dfa-4f80-812c-4fc20a800a47 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.755 185914 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.756 185914 DEBUG nova.virt.libvirt.vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:41:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1167094500',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1167094500',id=18,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:41:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-r5iyo2b2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:42:30Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=ed0f983d-6cd6-429c-8af1-0d52a56731d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.756 185914 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "c9816814-5dfa-4f80-812c-4fc20a800a47", "address": "fa:16:3e:b7:0e:aa", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9816814-5d", "ovs_interfaceid": "c9816814-5dfa-4f80-812c-4fc20a800a47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.757 185914 DEBUG nova.network.os_vif_util [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.757 185914 DEBUG os_vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.759 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.759 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9816814-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.760 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.762 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.765 185914 INFO os_vif [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:0e:aa,bridge_name='br-int',has_traffic_filtering=True,id=c9816814-5dfa-4f80-812c-4fc20a800a47,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9816814-5d')
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.765 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.765 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.766 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.766 185914 DEBUG nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.766 185914 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Deleting instance files /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6_del
Feb 16 13:42:47 compute-1 nova_compute[185910]: 2026-02-16 13:42:47.767 185914 INFO nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Deletion of /var/lib/nova/instances/ed0f983d-6cd6-429c-8af1-0d52a56731d6_del complete
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.255 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:49 compute-1 openstack_network_exporter[198096]: ERROR   13:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:42:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:42:49 compute-1 openstack_network_exporter[198096]: ERROR   13:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:42:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.461 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.462 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.462 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.462 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.463 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.463 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-unplugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.463 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.463 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.463 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.464 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.464 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.464 185914 WARNING nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state migrating.
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.464 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.464 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 WARNING nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state migrating.
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.465 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.466 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.466 185914 DEBUG oslo_concurrency.lockutils [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.466 185914 DEBUG nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] No waiting events found dispatching network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:42:49 compute-1 nova_compute[185910]: 2026-02-16 13:42:49.466 185914 WARNING nova.compute.manager [req-48b396ff-fbe0-4c9b-b3b5-01b8947823dd req-81f5dc97-dde0-4cd4-bc41-1241060479a6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Received unexpected event network-vif-plugged-c9816814-5dfa-4f80-812c-4fc20a800a47 for instance with vm_state active and task_state migrating.
Feb 16 13:42:52 compute-1 nova_compute[185910]: 2026-02-16 13:42:52.762 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:54 compute-1 nova_compute[185910]: 2026-02-16 13:42:54.257 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.508 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.508 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.508 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "ed0f983d-6cd6-429c-8af1-0d52a56731d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.536 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.536 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.536 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.537 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.704 185914 WARNING nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.706 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5787MB free_disk=73.22362899780273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.706 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.706 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.757 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance ed0f983d-6cd6-429c-8af1-0d52a56731d6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.779 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.833 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 7685ce54-87f0-4b62-961c-75f6175471f1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.833 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.834 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.877 185914 DEBUG nova.compute.provider_tree [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.892 185914 DEBUG nova.scheduler.client.report [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.915 185914 DEBUG nova.compute.resource_tracker [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.916 185914 DEBUG oslo_concurrency.lockutils [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:42:55 compute-1 nova_compute[185910]: 2026-02-16 13:42:55.926 185914 INFO nova.compute.manager [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:42:56 compute-1 nova_compute[185910]: 2026-02-16 13:42:56.015 185914 INFO nova.scheduler.client.report [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 7685ce54-87f0-4b62-961c-75f6175471f1
Feb 16 13:42:56 compute-1 nova_compute[185910]: 2026-02-16 13:42:56.015 185914 DEBUG nova.virt.libvirt.driver [None req-d80a3cf2-6a33-46be-b9cd-5e124e82152d bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:42:57 compute-1 nova_compute[185910]: 2026-02-16 13:42:57.817 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:57 compute-1 podman[213282]: 2026-02-16 13:42:57.921957639 +0000 UTC m=+0.059454998 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent)
Feb 16 13:42:57 compute-1 podman[213281]: 2026-02-16 13:42:57.947938078 +0000 UTC m=+0.085842808 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:42:58 compute-1 sshd-session[213317]: Invalid user postgres from 188.166.42.159 port 53392
Feb 16 13:42:59 compute-1 nova_compute[185910]: 2026-02-16 13:42:59.260 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:42:59 compute-1 sshd-session[213317]: Connection closed by invalid user postgres 188.166.42.159 port 53392 [preauth]
Feb 16 13:42:59 compute-1 nova_compute[185910]: 2026-02-16 13:42:59.841 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249364.8346782, ed0f983d-6cd6-429c-8af1-0d52a56731d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:42:59 compute-1 nova_compute[185910]: 2026-02-16 13:42:59.842 185914 INFO nova.compute.manager [-] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] VM Stopped (Lifecycle Event)
Feb 16 13:42:59 compute-1 nova_compute[185910]: 2026-02-16 13:42:59.874 185914 DEBUG nova.compute.manager [None req-45cadf3a-0319-49f2-859d-c3eef9b405b1 - - - - - -] [instance: ed0f983d-6cd6-429c-8af1-0d52a56731d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:02 compute-1 nova_compute[185910]: 2026-02-16 13:43:02.853 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:03.353 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:03.354 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:03.354 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:04 compute-1 nova_compute[185910]: 2026-02-16 13:43:04.262 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:04 compute-1 podman[213319]: 2026-02-16 13:43:04.937940671 +0000 UTC m=+0.068982065 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:43:05 compute-1 podman[195236]: time="2026-02-16T13:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:43:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:43:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:43:07 compute-1 nova_compute[185910]: 2026-02-16 13:43:07.909 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:09 compute-1 nova_compute[185910]: 2026-02-16 13:43:09.263 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:12 compute-1 nova_compute[185910]: 2026-02-16 13:43:12.959 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:13 compute-1 podman[213348]: 2026-02-16 13:43:13.921315642 +0000 UTC m=+0.059345816 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:43:14 compute-1 nova_compute[185910]: 2026-02-16 13:43:14.266 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:17 compute-1 nova_compute[185910]: 2026-02-16 13:43:17.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:17 compute-1 nova_compute[185910]: 2026-02-16 13:43:17.993 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:18 compute-1 nova_compute[185910]: 2026-02-16 13:43:18.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:19 compute-1 nova_compute[185910]: 2026-02-16 13:43:19.267 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:19 compute-1 openstack_network_exporter[198096]: ERROR   13:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:43:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:43:19 compute-1 openstack_network_exporter[198096]: ERROR   13:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:43:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:43:19 compute-1 nova_compute[185910]: 2026-02-16 13:43:19.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.501 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.501 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.523 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.620 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.621 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.638 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.639 185914 INFO nova.compute.claims [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.685 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.801 185914 DEBUG nova.compute.provider_tree [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.842 185914 DEBUG nova.scheduler.client.report [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.869 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.871 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.875 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.875 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.876 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.935 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.936 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.958 185914 INFO nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.979 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:43:22 compute-1 nova_compute[185910]: 2026-02-16 13:43:22.996 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.071 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.072 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5800MB free_disk=73.22362899780273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.073 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.073 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.103 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.106 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.106 185914 INFO nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating image(s)
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.107 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.107 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.108 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.131 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.179 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5a1cf877-f781-4088-8f98-19d39a95d5bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.179 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.180 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.197 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.199 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.199 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.209 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.253 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.269 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.276 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.277 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.298 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.299 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.319 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.320 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.320 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.372 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.374 185914 DEBUG nova.virt.disk.api [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.374 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.427 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.428 185914 DEBUG nova.virt.disk.api [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.429 185914 DEBUG nova.objects.instance [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.454 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.455 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Ensure instance console log exists: /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.456 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.456 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.457 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:23 compute-1 nova_compute[185910]: 2026-02-16 13:43:23.627 185914 DEBUG nova.policy [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:43:24 compute-1 nova_compute[185910]: 2026-02-16 13:43:24.278 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:24 compute-1 nova_compute[185910]: 2026-02-16 13:43:24.298 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:24 compute-1 nova_compute[185910]: 2026-02-16 13:43:24.790 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Successfully created port: b5736eee-a7c7-4376-87f8-2ba8e852813f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.606 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Successfully updated port: b5736eee-a7c7-4376-87f8-2ba8e852813f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.622 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.622 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.622 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.716 185914 DEBUG nova.compute.manager [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-changed-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.717 185914 DEBUG nova.compute.manager [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Refreshing instance network info cache due to event network-changed-b5736eee-a7c7-4376-87f8-2ba8e852813f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.718 185914 DEBUG oslo_concurrency.lockutils [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:43:25 compute-1 nova_compute[185910]: 2026-02-16 13:43:25.793 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:43:26 compute-1 nova_compute[185910]: 2026-02-16 13:43:26.626 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:26 compute-1 nova_compute[185910]: 2026-02-16 13:43:26.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:26 compute-1 nova_compute[185910]: 2026-02-16 13:43:26.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:43:26 compute-1 nova_compute[185910]: 2026-02-16 13:43:26.661 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:43:26 compute-1 nova_compute[185910]: 2026-02-16 13:43:26.981 185914 DEBUG nova.network.neutron [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.006 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.006 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance network_info: |[{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.007 185914 DEBUG oslo_concurrency.lockutils [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.007 185914 DEBUG nova.network.neutron [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Refreshing network info cache for port b5736eee-a7c7-4376-87f8-2ba8e852813f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.010 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Start _get_guest_xml network_info=[{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.014 185914 WARNING nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.019 185914 DEBUG nova.virt.libvirt.host [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.021 185914 DEBUG nova.virt.libvirt.host [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.029 185914 DEBUG nova.virt.libvirt.host [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.030 185914 DEBUG nova.virt.libvirt.host [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.032 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.032 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.032 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.032 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.033 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.033 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.033 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.033 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.033 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.034 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.034 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.034 185914 DEBUG nova.virt.hardware [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.037 185914 DEBUG nova.virt.libvirt.vif [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:43:23Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.038 185914 DEBUG nova.network.os_vif_util [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.038 185914 DEBUG nova.network.os_vif_util [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.039 185914 DEBUG nova.objects.instance [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.053 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <uuid>5a1cf877-f781-4088-8f98-19d39a95d5bc</uuid>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <name>instance-00000013</name>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-444598461</nova:name>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:43:27</nova:creationTime>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         <nova:port uuid="b5736eee-a7c7-4376-87f8-2ba8e852813f">
Feb 16 13:43:27 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <system>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="serial">5a1cf877-f781-4088-8f98-19d39a95d5bc</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="uuid">5a1cf877-f781-4088-8f98-19d39a95d5bc</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </system>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <os>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </os>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <features>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </features>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:e4:03:04"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <target dev="tapb5736eee-a7"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/console.log" append="off"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <video>
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </video>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:43:27 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:43:27 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:43:27 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:43:27 compute-1 nova_compute[185910]: </domain>
Feb 16 13:43:27 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.054 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Preparing to wait for external event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.054 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.055 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.055 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.055 185914 DEBUG nova.virt.libvirt.vif [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:43:23Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.055 185914 DEBUG nova.network.os_vif_util [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.056 185914 DEBUG nova.network.os_vif_util [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.056 185914 DEBUG os_vif [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.057 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.057 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.057 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.061 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.061 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5736eee-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.061 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5736eee-a7, col_values=(('external_ids', {'iface-id': 'b5736eee-a7c7-4376-87f8-2ba8e852813f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:03:04', 'vm-uuid': '5a1cf877-f781-4088-8f98-19d39a95d5bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.063 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.0649] manager: (tapb5736eee-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.066 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.070 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.071 185914 INFO os_vif [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7')
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.116 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.117 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.117 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:e4:03:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.117 185914 INFO nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Using config drive
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.591 185914 INFO nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Creating config drive at /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.594 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5cl48ncv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.713 185914 DEBUG oslo_concurrency.processutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5cl48ncv" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:43:27 compute-1 kernel: tapb5736eee-a7: entered promiscuous mode
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.7774] manager: (tapb5736eee-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Feb 16 13:43:27 compute-1 systemd-udevd[213407]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:43:27 compute-1 ovn_controller[96285]: 2026-02-16T13:43:27Z|00146|binding|INFO|Claiming lport b5736eee-a7c7-4376-87f8-2ba8e852813f for this chassis.
Feb 16 13:43:27 compute-1 ovn_controller[96285]: 2026-02-16T13:43:27Z|00147|binding|INFO|b5736eee-a7c7-4376-87f8-2ba8e852813f: Claiming fa:16:3e:e4:03:04 10.100.0.11
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.816 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 ovn_controller[96285]: 2026-02-16T13:43:27Z|00148|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f ovn-installed in OVS
Feb 16 13:43:27 compute-1 nova_compute[185910]: 2026-02-16 13:43:27.822 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:27 compute-1 ovn_controller[96285]: 2026-02-16T13:43:27Z|00149|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f up in Southbound
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.824 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:03:04 10.100.0.11'], port_security=['fa:16:3e:e4:03:04 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5a1cf877-f781-4088-8f98-19d39a95d5bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b5736eee-a7c7-4376-87f8-2ba8e852813f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.825 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b5736eee-a7c7-4376-87f8-2ba8e852813f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.826 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.8279] device (tapb5736eee-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.8284] device (tapb5736eee-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.836 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[886fa9da-448a-441e-82e4-d95d98332709]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.837 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.840 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.840 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[047667e7-256e-4a6a-bb2f-a40cf3c232c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.842 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[110c1d89-e4c2-40d6-aa06-5d2072385dbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 systemd-machined[155419]: New machine qemu-13-instance-00000013.
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.851 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[c279d93b-4f1d-4c3e-adf6-96e214bdbd1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.863 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ea5e2d-059e-4c44-9dc0-6627290c3702]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.895 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[f07ea857-de79-48a2-83fe-d7c11df856c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.9039] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Feb 16 13:43:27 compute-1 systemd-udevd[213411]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.903 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f922cad4-65a2-41ad-9888-59ac1657539c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.936 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d712edcc-5fb5-4554-b923-9841e809aa86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.942 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d1877119-34e4-4483-982b-e919517f49d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 NetworkManager[56388]: <info>  [1771249407.9654] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.971 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[86dea6a3-780f-4349-b33a-5379189dd55f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:27.986 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf8aefc-e096-4c13-9124-31e9c512af66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549238, 'reachable_time': 27547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213443, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.003 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[dd78ac93-1a7a-4f86-af11-60939e800844]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549238, 'tstamp': 549238}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213445, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 sshd-session[213391]: Invalid user eigenlayer from 2.57.122.210 port 37916
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.024 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e9b22ca8-e833-4cd0-a9ad-fc58572c49d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549238, 'reachable_time': 27547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213447, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.055 185914 DEBUG nova.compute.manager [req-96acaecd-7d8d-4b0f-b70b-c6ce19ca732a req-2b1cb471-8861-4060-99bf-4243c13bd27a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.056 185914 DEBUG oslo_concurrency.lockutils [req-96acaecd-7d8d-4b0f-b70b-c6ce19ca732a req-2b1cb471-8861-4060-99bf-4243c13bd27a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.056 185914 DEBUG oslo_concurrency.lockutils [req-96acaecd-7d8d-4b0f-b70b-c6ce19ca732a req-2b1cb471-8861-4060-99bf-4243c13bd27a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.057 185914 DEBUG oslo_concurrency.lockutils [req-96acaecd-7d8d-4b0f-b70b-c6ce19ca732a req-2b1cb471-8861-4060-99bf-4243c13bd27a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.059 185914 DEBUG nova.compute.manager [req-96acaecd-7d8d-4b0f-b70b-c6ce19ca732a req-2b1cb471-8861-4060-99bf-4243c13bd27a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Processing event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.066 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[74e7707a-fd7e-4c9b-a09a-941e6db5e838]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 podman[213444]: 2026-02-16 13:43:28.106862121 +0000 UTC m=+0.087988856 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9/ubi-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.119 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a51f9ded-cac6-46c6-a182-31620b843fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 podman[213446]: 2026-02-16 13:43:28.120739904 +0000 UTC m=+0.101372085 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.121 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.121 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.122 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:28 compute-1 NetworkManager[56388]: <info>  [1771249408.1245] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Feb 16 13:43:28 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.123 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.127 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.129 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:28 compute-1 ovn_controller[96285]: 2026-02-16T13:43:28Z|00150|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.130 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.131 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.132 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.135 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2f4bbe45-1361-4a21-91de-4fcecab8db6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.135 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.136 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:43:28 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:28.137 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.311 185914 DEBUG nova.network.neutron [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updated VIF entry in instance network info cache for port b5736eee-a7c7-4376-87f8-2ba8e852813f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.312 185914 DEBUG nova.network.neutron [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:43:28 compute-1 sshd-session[213391]: Connection closed by invalid user eigenlayer 2.57.122.210 port 37916 [preauth]
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.330 185914 DEBUG oslo_concurrency.lockutils [req-14271a86-21a2-4dae-9d98-d61630836c0c req-1fb1c445-d888-4f9a-a1d3-b3148067d393 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:43:28 compute-1 podman[213523]: 2026-02-16 13:43:28.493315626 +0000 UTC m=+0.057333521 container create a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.516 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.518 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249408.517477, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.518 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Started (Lifecycle Event)
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.522 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.528 185914 INFO nova.virt.libvirt.driver [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance spawned successfully.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.528 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:43:28 compute-1 systemd[1]: Started libpod-conmon-a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b.scope.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.549 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:28 compute-1 podman[213523]: 2026-02-16 13:43:28.460309549 +0000 UTC m=+0.024327564 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.559 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:43:28 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.563 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.564 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.564 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.565 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.565 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.566 185914 DEBUG nova.virt.libvirt.driver [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:43:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bf39e1145315fdb3abb059ca342acd7d2c0b6d8ec7141c06c24ecc0f1acd5fe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:43:28 compute-1 podman[213523]: 2026-02-16 13:43:28.580836808 +0000 UTC m=+0.144854763 container init a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 13:43:28 compute-1 podman[213523]: 2026-02-16 13:43:28.585212166 +0000 UTC m=+0.149230081 container start a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.598 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.598 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249408.518452, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.598 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Paused (Lifecycle Event)
Feb 16 13:43:28 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [NOTICE]   (213543) : New worker (213545) forked
Feb 16 13:43:28 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [NOTICE]   (213543) : Loading success.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.656 185914 INFO nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Took 5.55 seconds to spawn the instance on the hypervisor.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.656 185914 DEBUG nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.657 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.661 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.661 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.661 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.668 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249408.5218492, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.668 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Resumed (Lifecycle Event)
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.697 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.697 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.697 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.704 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.708 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.731 185914 INFO nova.compute.manager [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Took 6.15 seconds to build instance.
Feb 16 13:43:28 compute-1 nova_compute[185910]: 2026-02-16 13:43:28.749 185914 DEBUG oslo_concurrency.lockutils [None req-98a30a2f-4ba1-43a7-ab3d-4129dc55f6cf e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:29 compute-1 nova_compute[185910]: 2026-02-16 13:43:29.281 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.167 185914 DEBUG nova.compute.manager [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.168 185914 DEBUG oslo_concurrency.lockutils [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.168 185914 DEBUG oslo_concurrency.lockutils [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.169 185914 DEBUG oslo_concurrency.lockutils [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.169 185914 DEBUG nova.compute.manager [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.169 185914 WARNING nova.compute.manager [req-a2ff29ea-76e3-4111-a8d7-2f2f03dbf3bb req-af22c873-5f30-4526-94c8-7c755750957a faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state None.
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.647 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:30 compute-1 nova_compute[185910]: 2026-02-16 13:43:30.648 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:43:32 compute-1 nova_compute[185910]: 2026-02-16 13:43:32.063 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:34 compute-1 nova_compute[185910]: 2026-02-16 13:43:34.283 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:34 compute-1 nova_compute[185910]: 2026-02-16 13:43:34.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:34 compute-1 nova_compute[185910]: 2026-02-16 13:43:34.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:43:35 compute-1 podman[195236]: time="2026-02-16T13:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:43:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:43:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 13:43:35 compute-1 podman[213554]: 2026-02-16 13:43:35.947203067 +0000 UTC m=+0.085684063 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:43:36 compute-1 nova_compute[185910]: 2026-02-16 13:43:36.937 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:43:37 compute-1 nova_compute[185910]: 2026-02-16 13:43:37.065 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:39 compute-1 nova_compute[185910]: 2026-02-16 13:43:39.283 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:39.620 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:43:39 compute-1 nova_compute[185910]: 2026-02-16 13:43:39.620 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:39.621 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:43:40 compute-1 ovn_controller[96285]: 2026-02-16T13:43:40Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:03:04 10.100.0.11
Feb 16 13:43:40 compute-1 ovn_controller[96285]: 2026-02-16T13:43:40Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:03:04 10.100.0.11
Feb 16 13:43:42 compute-1 nova_compute[185910]: 2026-02-16 13:43:42.066 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-1 nova_compute[185910]: 2026-02-16 13:43:44.285 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:44 compute-1 sshd-session[213592]: Invalid user test from 146.190.226.24 port 41864
Feb 16 13:43:44 compute-1 podman[213594]: 2026-02-16 13:43:44.52273693 +0000 UTC m=+0.059906311 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:43:44 compute-1 sshd-session[213592]: Connection closed by invalid user test 146.190.226.24 port 41864 [preauth]
Feb 16 13:43:45 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:43:45.625 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:43:47 compute-1 nova_compute[185910]: 2026-02-16 13:43:47.069 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:49 compute-1 nova_compute[185910]: 2026-02-16 13:43:49.287 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:49 compute-1 openstack_network_exporter[198096]: ERROR   13:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:43:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:43:49 compute-1 openstack_network_exporter[198096]: ERROR   13:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:43:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:43:49 compute-1 sshd-session[213619]: Invalid user postgres from 188.166.42.159 port 47426
Feb 16 13:43:49 compute-1 sshd-session[213619]: Connection closed by invalid user postgres 188.166.42.159 port 47426 [preauth]
Feb 16 13:43:52 compute-1 nova_compute[185910]: 2026-02-16 13:43:52.071 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:54 compute-1 nova_compute[185910]: 2026-02-16 13:43:54.289 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:57 compute-1 nova_compute[185910]: 2026-02-16 13:43:57.073 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:43:58 compute-1 podman[213622]: 2026-02-16 13:43:58.910906796 +0000 UTC m=+0.040998973 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 13:43:58 compute-1 podman[213621]: 2026-02-16 13:43:58.913255609 +0000 UTC m=+0.050945410 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, release=1770267347, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Feb 16 13:43:59 compute-1 nova_compute[185910]: 2026-02-16 13:43:59.291 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.075 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.919 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.946 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Triggering sync for uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.946 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.947 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:02 compute-1 nova_compute[185910]: 2026-02-16 13:44:02.970 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:03.354 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:03.354 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:03.355 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:04 compute-1 nova_compute[185910]: 2026-02-16 13:44:04.323 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:05 compute-1 podman[195236]: time="2026-02-16T13:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:44:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:44:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 13:44:06 compute-1 podman[213659]: 2026-02-16 13:44:06.951278705 +0000 UTC m=+0.089776294 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:44:07 compute-1 nova_compute[185910]: 2026-02-16 13:44:07.077 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:09 compute-1 ovn_controller[96285]: 2026-02-16T13:44:09Z|00151|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 16 13:44:09 compute-1 nova_compute[185910]: 2026-02-16 13:44:09.372 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:12 compute-1 nova_compute[185910]: 2026-02-16 13:44:12.079 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:14 compute-1 nova_compute[185910]: 2026-02-16 13:44:14.375 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:14 compute-1 podman[213686]: 2026-02-16 13:44:14.906302181 +0000 UTC m=+0.049203864 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:44:17 compute-1 nova_compute[185910]: 2026-02-16 13:44:17.082 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:18 compute-1 nova_compute[185910]: 2026-02-16 13:44:18.660 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:18 compute-1 nova_compute[185910]: 2026-02-16 13:44:18.661 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:19 compute-1 nova_compute[185910]: 2026-02-16 13:44:19.414 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:19 compute-1 openstack_network_exporter[198096]: ERROR   13:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:44:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:44:19 compute-1 openstack_network_exporter[198096]: ERROR   13:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:44:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:44:20 compute-1 nova_compute[185910]: 2026-02-16 13:44:20.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:22 compute-1 nova_compute[185910]: 2026-02-16 13:44:22.084 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:22 compute-1 nova_compute[185910]: 2026-02-16 13:44:22.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.762 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.763 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.763 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.763 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.884 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.935 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:23 compute-1 nova_compute[185910]: 2026-02-16 13:44:23.936 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.002 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.154 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.156 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5652MB free_disk=73.19443893432617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.156 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.157 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.429 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.447 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 5a1cf877-f781-4088-8f98-19d39a95d5bc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.448 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.448 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.827 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.861 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.895 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:44:24 compute-1 nova_compute[185910]: 2026-02-16 13:44:24.896 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:27 compute-1 nova_compute[185910]: 2026-02-16 13:44:27.087 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:27 compute-1 nova_compute[185910]: 2026-02-16 13:44:27.890 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:28 compute-1 nova_compute[185910]: 2026-02-16 13:44:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:28 compute-1 nova_compute[185910]: 2026-02-16 13:44:28.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:44:28 compute-1 nova_compute[185910]: 2026-02-16 13:44:28.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:44:29 compute-1 nova_compute[185910]: 2026-02-16 13:44:29.431 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:29 compute-1 nova_compute[185910]: 2026-02-16 13:44:29.705 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:44:29 compute-1 nova_compute[185910]: 2026-02-16 13:44:29.705 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:44:29 compute-1 nova_compute[185910]: 2026-02-16 13:44:29.706 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:44:29 compute-1 nova_compute[185910]: 2026-02-16 13:44:29.706 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:44:29 compute-1 podman[213732]: 2026-02-16 13:44:29.921304743 +0000 UTC m=+0.049373218 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:44:29 compute-1 podman[213731]: 2026-02-16 13:44:29.92084131 +0000 UTC m=+0.058117593 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1770267347, architecture=x86_64, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Feb 16 13:44:32 compute-1 nova_compute[185910]: 2026-02-16 13:44:32.089 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:32 compute-1 nova_compute[185910]: 2026-02-16 13:44:32.873 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:44:33 compute-1 nova_compute[185910]: 2026-02-16 13:44:33.886 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:44:33 compute-1 nova_compute[185910]: 2026-02-16 13:44:33.887 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:44:33 compute-1 nova_compute[185910]: 2026-02-16 13:44:33.887 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:44:33 compute-1 nova_compute[185910]: 2026-02-16 13:44:33.888 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:44:34 compute-1 nova_compute[185910]: 2026-02-16 13:44:34.433 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:34 compute-1 nova_compute[185910]: 2026-02-16 13:44:34.769 185914 DEBUG nova.compute.manager [None req-582ddd48-f5ef-4beb-baee-eacd33f589c9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 13:44:34 compute-1 nova_compute[185910]: 2026-02-16 13:44:34.855 185914 DEBUG nova.compute.provider_tree [None req-582ddd48-f5ef-4beb-baee-eacd33f589c9 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 26 to 31 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:44:35 compute-1 podman[195236]: time="2026-02-16T13:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:44:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:44:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 13:44:37 compute-1 nova_compute[185910]: 2026-02-16 13:44:37.091 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:37 compute-1 podman[213772]: 2026-02-16 13:44:37.938096929 +0000 UTC m=+0.081923172 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 16 13:44:39 compute-1 nova_compute[185910]: 2026-02-16 13:44:39.434 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:41 compute-1 sshd-session[213799]: Invalid user postgres from 188.166.42.159 port 56682
Feb 16 13:44:41 compute-1 nova_compute[185910]: 2026-02-16 13:44:41.810 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Check if temp file /var/lib/nova/instances/tmpeh4lji_d exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:44:41 compute-1 nova_compute[185910]: 2026-02-16 13:44:41.811 185914 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:44:41 compute-1 sshd-session[213799]: Connection closed by invalid user postgres 188.166.42.159 port 56682 [preauth]
Feb 16 13:44:42 compute-1 nova_compute[185910]: 2026-02-16 13:44:42.093 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:42 compute-1 nova_compute[185910]: 2026-02-16 13:44:42.534 185914 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:42 compute-1 nova_compute[185910]: 2026-02-16 13:44:42.605 185914 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:42 compute-1 nova_compute[185910]: 2026-02-16 13:44:42.607 185914 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:44:42 compute-1 nova_compute[185910]: 2026-02-16 13:44:42.655 185914 DEBUG oslo_concurrency.processutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:44:44 compute-1 nova_compute[185910]: 2026-02-16 13:44:44.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:45 compute-1 podman[213807]: 2026-02-16 13:44:45.906407284 +0000 UTC m=+0.052770179 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:44:47 compute-1 nova_compute[185910]: 2026-02-16 13:44:47.095 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:47 compute-1 sshd-session[213831]: Accepted publickey for nova from 192.168.122.100 port 56652 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:44:48 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:44:48 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:44:48 compute-1 systemd-logind[821]: New session 42 of user nova.
Feb 16 13:44:48 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:44:48 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:44:48 compute-1 systemd[213835]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:44:48 compute-1 systemd[213835]: Queued start job for default target Main User Target.
Feb 16 13:44:48 compute-1 systemd[213835]: Created slice User Application Slice.
Feb 16 13:44:48 compute-1 systemd[213835]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:44:48 compute-1 systemd[213835]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:44:48 compute-1 systemd[213835]: Reached target Paths.
Feb 16 13:44:48 compute-1 systemd[213835]: Reached target Timers.
Feb 16 13:44:48 compute-1 systemd[213835]: Starting D-Bus User Message Bus Socket...
Feb 16 13:44:48 compute-1 systemd[213835]: Starting Create User's Volatile Files and Directories...
Feb 16 13:44:48 compute-1 systemd[213835]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:44:48 compute-1 systemd[213835]: Reached target Sockets.
Feb 16 13:44:48 compute-1 systemd[213835]: Finished Create User's Volatile Files and Directories.
Feb 16 13:44:48 compute-1 systemd[213835]: Reached target Basic System.
Feb 16 13:44:48 compute-1 systemd[213835]: Reached target Main User Target.
Feb 16 13:44:48 compute-1 systemd[213835]: Startup finished in 146ms.
Feb 16 13:44:48 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:44:48 compute-1 systemd[1]: Started Session 42 of User nova.
Feb 16 13:44:48 compute-1 sshd-session[213831]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:44:48 compute-1 sshd-session[213850]: Received disconnect from 192.168.122.100 port 56652:11: disconnected by user
Feb 16 13:44:48 compute-1 sshd-session[213850]: Disconnected from user nova 192.168.122.100 port 56652
Feb 16 13:44:48 compute-1 sshd-session[213831]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:44:48 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Feb 16 13:44:48 compute-1 systemd-logind[821]: Session 42 logged out. Waiting for processes to exit.
Feb 16 13:44:48 compute-1 systemd-logind[821]: Removed session 42.
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.281 185914 DEBUG nova.compute.manager [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.281 185914 DEBUG oslo_concurrency.lockutils [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.281 185914 DEBUG oslo_concurrency.lockutils [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.282 185914 DEBUG oslo_concurrency.lockutils [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.282 185914 DEBUG nova.compute.manager [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.282 185914 DEBUG nova.compute.manager [req-34939a51-70e1-418f-a6f8-865d17b08c54 req-dbd1e68c-4c2f-4857-ac10-f9b1fbdd9414 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:44:49 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:49.312 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:44:49 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:49.313 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.352 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:49 compute-1 openstack_network_exporter[198096]: ERROR   13:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:44:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:44:49 compute-1 openstack_network_exporter[198096]: ERROR   13:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:44:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:44:49 compute-1 nova_compute[185910]: 2026-02-16 13:44:49.438 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.151 185914 INFO nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Took 7.49 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.151 185914 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.198 185914 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpeh4lji_d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a1cf877-f781-4088-8f98-19d39a95d5bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(86a1405a-141f-41b8-ba95-983e499d0ca0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.241 185914 DEBUG nova.objects.instance [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 5a1cf877-f781-4088-8f98-19d39a95d5bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.243 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.245 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.245 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.284 185914 DEBUG nova.virt.libvirt.vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:43:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:43:28Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.285 185914 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.287 185914 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.287 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:44:50 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:e4:03:04"/>
Feb 16 13:44:50 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:44:50 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:44:50 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:44:50 compute-1 nova_compute[185910]:   <target dev="tapb5736eee-a7"/>
Feb 16 13:44:50 compute-1 nova_compute[185910]: </interface>
Feb 16 13:44:50 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.288 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.749 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.749 185914 INFO nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:44:50 compute-1 nova_compute[185910]: 2026-02-16 13:44:50.853 185914 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.356 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.357 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.865 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.866 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.960 185914 DEBUG nova.compute.manager [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.961 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.961 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.961 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG nova.compute.manager [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 WARNING nova.compute.manager [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state migrating.
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG nova.compute.manager [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-changed-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG nova.compute.manager [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Refreshing instance network info cache due to event network-changed-b5736eee-a7c7-4376-87f8-2ba8e852813f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:44:51 compute-1 nova_compute[185910]: 2026-02-16 13:44:51.962 185914 DEBUG nova.network.neutron [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Refreshing network info cache for port b5736eee-a7c7-4376-87f8-2ba8e852813f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.097 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.279 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249492.2787235, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.280 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Paused (Lifecycle Event)
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.302 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.306 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.339 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.370 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.370 185914 DEBUG nova.virt.libvirt.migration [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:44:52 compute-1 kernel: tapb5736eee-a7 (unregistering): left promiscuous mode
Feb 16 13:44:52 compute-1 NetworkManager[56388]: <info>  [1771249492.4098] device (tapb5736eee-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:44:52 compute-1 ovn_controller[96285]: 2026-02-16T13:44:52Z|00152|binding|INFO|Releasing lport b5736eee-a7c7-4376-87f8-2ba8e852813f from this chassis (sb_readonly=0)
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.414 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:52 compute-1 ovn_controller[96285]: 2026-02-16T13:44:52Z|00153|binding|INFO|Setting lport b5736eee-a7c7-4376-87f8-2ba8e852813f down in Southbound
Feb 16 13:44:52 compute-1 ovn_controller[96285]: 2026-02-16T13:44:52Z|00154|binding|INFO|Removing iface tapb5736eee-a7 ovn-installed in OVS
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.423 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.426 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:03:04 10.100.0.11'], port_security=['fa:16:3e:e4:03:04 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5a1cf877-f781-4088-8f98-19d39a95d5bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b5736eee-a7c7-4376-87f8-2ba8e852813f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.430 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b5736eee-a7c7-4376-87f8-2ba8e852813f in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.432 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.435 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[741e0c65-a270-418d-9795-63580e92d51c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.436 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:44:52 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Feb 16 13:44:52 compute-1 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 16.699s CPU time.
Feb 16 13:44:52 compute-1 systemd-machined[155419]: Machine qemu-13-instance-00000013 terminated.
Feb 16 13:44:52 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [NOTICE]   (213543) : haproxy version is 2.8.14-c23fe91
Feb 16 13:44:52 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [NOTICE]   (213543) : path to executable is /usr/sbin/haproxy
Feb 16 13:44:52 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [WARNING]  (213543) : Exiting Master process...
Feb 16 13:44:52 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [ALERT]    (213543) : Current worker (213545) exited with code 143 (Terminated)
Feb 16 13:44:52 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[213539]: [WARNING]  (213543) : All workers exited. Exiting... (0)
Feb 16 13:44:52 compute-1 systemd[1]: libpod-a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b.scope: Deactivated successfully.
Feb 16 13:44:52 compute-1 podman[213894]: 2026-02-16 13:44:52.570522854 +0000 UTC m=+0.047118976 container died a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:44:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b-userdata-shm.mount: Deactivated successfully.
Feb 16 13:44:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-1bf39e1145315fdb3abb059ca342acd7d2c0b6d8ec7141c06c24ecc0f1acd5fe-merged.mount: Deactivated successfully.
Feb 16 13:44:52 compute-1 podman[213894]: 2026-02-16 13:44:52.628482843 +0000 UTC m=+0.105078975 container cleanup a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 13:44:52 compute-1 systemd[1]: libpod-conmon-a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b.scope: Deactivated successfully.
Feb 16 13:44:52 compute-1 podman[213936]: 2026-02-16 13:44:52.689099713 +0000 UTC m=+0.042207523 container remove a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.694 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[44289fd7-7bef-414a-b924-e2b7830a7d75]: (4, ('Mon Feb 16 01:44:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b)\na2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b\nMon Feb 16 01:44:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (a2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b)\na2168a1373ca2171e02447d9dd7b933395a8f7cc41f9da67dd262eaafe18912b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.696 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c30005df-f1f4-424a-ba56-ae75159d65ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.697 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.700 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:52 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.707 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.712 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9c46acfa-a2af-4206-95cf-50c155619960]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 sshd-session[213868]: Invalid user test from 146.190.226.24 port 33408
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.726 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a0171abe-4140-4a9c-947a-99683a210742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.728 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ee639fc0-8990-40a5-9f57-740a46dfd9d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.743 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8e4ea1dd-01c3-465b-b3a4-9880ca4b7b02]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549231, 'reachable_time': 39040, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213958, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.747 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:44:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:52.747 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[c940e7d3-8587-4124-97b1-45fd482f8ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:44:52 compute-1 sshd-session[213868]: Connection closed by invalid user test 146.190.226.24 port 33408 [preauth]
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.983 185914 DEBUG nova.compute.manager [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.983 185914 DEBUG oslo_concurrency.lockutils [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.984 185914 DEBUG oslo_concurrency.lockutils [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.984 185914 DEBUG oslo_concurrency.lockutils [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.984 185914 DEBUG nova.compute.manager [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:52 compute-1 nova_compute[185910]: 2026-02-16 13:44:52.984 185914 DEBUG nova.compute.manager [req-b640e661-33eb-493c-9480-338a0755a8df req-abc72ab9-a7b7-40ba-9136-685ab1d59b59 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.061 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.062 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.062 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.062 185914 DEBUG nova.virt.libvirt.guest [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '5a1cf877-f781-4088-8f98-19d39a95d5bc' (instance-00000013) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.062 185914 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migration operation has completed
Feb 16 13:44:53 compute-1 nova_compute[185910]: 2026-02-16 13:44:53.063 185914 INFO nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] _post_live_migration() is started..
Feb 16 13:44:54 compute-1 nova_compute[185910]: 2026-02-16 13:44:54.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.112 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.113 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.113 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.114 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.114 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.114 185914 WARNING nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state migrating.
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.115 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.115 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.115 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.115 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.116 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.116 185914 WARNING nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state migrating.
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.116 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.117 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.117 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.117 185914 DEBUG oslo_concurrency.lockutils [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.117 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.118 185914 DEBUG nova.compute.manager [req-ed1a59fe-cfac-4dd7-890c-cd4fe9b5cf58 req-42eca49e-46e7-4c23-8751-37766ad229fd faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-unplugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.784 185914 DEBUG nova.network.neutron [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updated VIF entry in instance network info cache for port b5736eee-a7c7-4376-87f8-2ba8e852813f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:44:55 compute-1 nova_compute[185910]: 2026-02-16 13:44:55.785 185914 DEBUG nova.network.neutron [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Updating instance_info_cache with network_info: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.020 185914 DEBUG nova.network.neutron [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port b5736eee-a7c7-4376-87f8-2ba8e852813f and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.020 185914 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.021 185914 DEBUG nova.virt.libvirt.vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:43:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-444598461',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-444598461',id=19,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:43:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-dq2i0im0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:44:37Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=5a1cf877-f781-4088-8f98-19d39a95d5bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.021 185914 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "address": "fa:16:3e:e4:03:04", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5736eee-a7", "ovs_interfaceid": "b5736eee-a7c7-4376-87f8-2ba8e852813f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.022 185914 DEBUG nova.network.os_vif_util [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.022 185914 DEBUG os_vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.024 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.024 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5736eee-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.026 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.027 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.030 185914 INFO os_vif [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:03:04,bridge_name='br-int',has_traffic_filtering=True,id=b5736eee-a7c7-4376-87f8-2ba8e852813f,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5736eee-a7')
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.030 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.031 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.031 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.031 185914 DEBUG nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.032 185914 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Deleting instance files /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc_del
Feb 16 13:44:56 compute-1 nova_compute[185910]: 2026-02-16 13:44:56.032 185914 INFO nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Deletion of /var/lib/nova/instances/5a1cf877-f781-4088-8f98-19d39a95d5bc_del complete
Feb 16 13:44:56 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:44:56.315 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.303 185914 DEBUG nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.303 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.303 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.304 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.304 185914 DEBUG nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.304 185914 WARNING nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state migrating.
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.305 185914 DEBUG nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.305 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.305 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.306 185914 DEBUG oslo_concurrency.lockutils [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.306 185914 DEBUG nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] No waiting events found dispatching network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.306 185914 WARNING nova.compute.manager [req-e9ae7d48-869e-4970-aad8-c56bf246b8fc req-8b8eeb3e-d21f-4513-9f9c-20bf7ea8e208 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Received unexpected event network-vif-plugged-b5736eee-a7c7-4376-87f8-2ba8e852813f for instance with vm_state active and task_state migrating.
Feb 16 13:44:58 compute-1 nova_compute[185910]: 2026-02-16 13:44:58.329 185914 DEBUG oslo_concurrency.lockutils [req-fd12c0a5-2232-433f-8aed-c5d1354c5db5 req-f4a92d38-691e-4c15-a026-f27861f33332 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-5a1cf877-f781-4088-8f98-19d39a95d5bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:44:58 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:44:58 compute-1 systemd[213835]: Activating special unit Exit the Session...
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped target Main User Target.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped target Basic System.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped target Paths.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped target Sockets.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped target Timers.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:44:58 compute-1 systemd[213835]: Closed D-Bus User Message Bus Socket.
Feb 16 13:44:58 compute-1 systemd[213835]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:44:58 compute-1 systemd[213835]: Removed slice User Application Slice.
Feb 16 13:44:58 compute-1 systemd[213835]: Reached target Shutdown.
Feb 16 13:44:58 compute-1 systemd[213835]: Finished Exit the Session.
Feb 16 13:44:58 compute-1 systemd[213835]: Reached target Exit the Session.
Feb 16 13:44:58 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:44:58 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:44:58 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:44:58 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:44:58 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:44:58 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:44:58 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:44:59 compute-1 nova_compute[185910]: 2026-02-16 13:44:59.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:00 compute-1 podman[213961]: 2026-02-16 13:45:00.920062573 +0000 UTC m=+0.060608551 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Feb 16 13:45:00 compute-1 podman[213960]: 2026-02-16 13:45:00.921355228 +0000 UTC m=+0.061689260 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible)
Feb 16 13:45:01 compute-1 nova_compute[185910]: 2026-02-16 13:45:01.027 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:45:03.355 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:45:03.355 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:45:03.356 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:04 compute-1 nova_compute[185910]: 2026-02-16 13:45:04.443 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:05 compute-1 podman[195236]: time="2026-02-16T13:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:45:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:45:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.800 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.800 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.800 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "5a1cf877-f781-4088-8f98-19d39a95d5bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.840 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.840 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.840 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:05 compute-1 nova_compute[185910]: 2026-02-16 13:45:05.841 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.018 185914 WARNING nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.020 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5786MB free_disk=73.22355651855469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.020 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.021 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.032 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.081 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 5a1cf877-f781-4088-8f98-19d39a95d5bc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.123 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.165 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 86a1405a-141f-41b8-ba95-983e499d0ca0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.166 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.167 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.227 185914 DEBUG nova.compute.provider_tree [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.250 185914 DEBUG nova.scheduler.client.report [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.294 185914 DEBUG nova.compute.resource_tracker [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.295 185914 DEBUG oslo_concurrency.lockutils [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.302 185914 INFO nova.compute.manager [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.475 185914 INFO nova.scheduler.client.report [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 86a1405a-141f-41b8-ba95-983e499d0ca0
Feb 16 13:45:06 compute-1 nova_compute[185910]: 2026-02-16 13:45:06.476 185914 DEBUG nova.virt.libvirt.driver [None req-665a2939-c829-4159-b077-d3b1b28b59e7 bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:45:07 compute-1 nova_compute[185910]: 2026-02-16 13:45:07.653 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249492.6477268, 5a1cf877-f781-4088-8f98-19d39a95d5bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:45:07 compute-1 nova_compute[185910]: 2026-02-16 13:45:07.653 185914 INFO nova.compute.manager [-] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] VM Stopped (Lifecycle Event)
Feb 16 13:45:07 compute-1 nova_compute[185910]: 2026-02-16 13:45:07.732 185914 DEBUG nova.compute.manager [None req-ed2e5e21-656e-49aa-afc6-9be2b546c9e8 - - - - - -] [instance: 5a1cf877-f781-4088-8f98-19d39a95d5bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:45:08 compute-1 podman[214001]: 2026-02-16 13:45:08.970904971 +0000 UTC m=+0.103743259 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:45:09 compute-1 nova_compute[185910]: 2026-02-16 13:45:09.448 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:11 compute-1 nova_compute[185910]: 2026-02-16 13:45:11.085 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:14 compute-1 nova_compute[185910]: 2026-02-16 13:45:14.451 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:16 compute-1 nova_compute[185910]: 2026-02-16 13:45:16.089 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:16 compute-1 podman[214027]: 2026-02-16 13:45:16.913102647 +0000 UTC m=+0.046511940 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:45:19 compute-1 openstack_network_exporter[198096]: ERROR   13:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:45:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:45:19 compute-1 openstack_network_exporter[198096]: ERROR   13:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:45:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:45:19 compute-1 nova_compute[185910]: 2026-02-16 13:45:19.452 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:20 compute-1 nova_compute[185910]: 2026-02-16 13:45:20.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:20 compute-1 nova_compute[185910]: 2026-02-16 13:45:20.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:21 compute-1 nova_compute[185910]: 2026-02-16 13:45:21.092 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:21 compute-1 nova_compute[185910]: 2026-02-16 13:45:21.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:22 compute-1 nova_compute[185910]: 2026-02-16 13:45:22.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.659 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.660 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.660 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.660 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.811 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.813 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5801MB free_disk=73.22355651855469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.813 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:45:23 compute-1 nova_compute[185910]: 2026-02-16 13:45:23.813 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.136 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.137 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.159 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.178 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.179 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.180 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:45:24 compute-1 nova_compute[185910]: 2026-02-16 13:45:24.455 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:26 compute-1 nova_compute[185910]: 2026-02-16 13:45:26.095 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:28 compute-1 nova_compute[185910]: 2026-02-16 13:45:28.174 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:28 compute-1 nova_compute[185910]: 2026-02-16 13:45:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:28 compute-1 nova_compute[185910]: 2026-02-16 13:45:28.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:45:28 compute-1 nova_compute[185910]: 2026-02-16 13:45:28.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:45:28 compute-1 nova_compute[185910]: 2026-02-16 13:45:28.669 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:45:29 compute-1 nova_compute[185910]: 2026-02-16 13:45:29.457 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:31 compute-1 nova_compute[185910]: 2026-02-16 13:45:31.145 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:31 compute-1 nova_compute[185910]: 2026-02-16 13:45:31.663 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:31 compute-1 podman[214051]: 2026-02-16 13:45:31.910814686 +0000 UTC m=+0.051123934 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=)
Feb 16 13:45:31 compute-1 podman[214052]: 2026-02-16 13:45:31.927903669 +0000 UTC m=+0.058378291 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:45:33 compute-1 nova_compute[185910]: 2026-02-16 13:45:33.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:45:33 compute-1 nova_compute[185910]: 2026-02-16 13:45:33.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:45:34 compute-1 nova_compute[185910]: 2026-02-16 13:45:34.460 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:34 compute-1 sshd-session[214092]: Invalid user postgres from 188.166.42.159 port 40272
Feb 16 13:45:35 compute-1 sshd-session[214092]: Connection closed by invalid user postgres 188.166.42.159 port 40272 [preauth]
Feb 16 13:45:35 compute-1 podman[195236]: time="2026-02-16T13:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:45:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:45:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 13:45:36 compute-1 nova_compute[185910]: 2026-02-16 13:45:36.149 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:39 compute-1 nova_compute[185910]: 2026-02-16 13:45:39.462 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:39 compute-1 podman[214094]: 2026-02-16 13:45:39.981883401 +0000 UTC m=+0.112465124 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:45:41 compute-1 nova_compute[185910]: 2026-02-16 13:45:41.152 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:43 compute-1 ovn_controller[96285]: 2026-02-16T13:45:43Z|00155|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 13:45:44 compute-1 nova_compute[185910]: 2026-02-16 13:45:44.465 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:46 compute-1 nova_compute[185910]: 2026-02-16 13:45:46.155 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:47 compute-1 podman[214122]: 2026-02-16 13:45:47.931970772 +0000 UTC m=+0.068064683 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:45:49 compute-1 openstack_network_exporter[198096]: ERROR   13:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:45:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:45:49 compute-1 openstack_network_exporter[198096]: ERROR   13:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:45:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:45:49 compute-1 nova_compute[185910]: 2026-02-16 13:45:49.469 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:51 compute-1 nova_compute[185910]: 2026-02-16 13:45:51.158 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:54 compute-1 nova_compute[185910]: 2026-02-16 13:45:54.471 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:55 compute-1 sshd-session[214146]: Invalid user eigen from 2.57.122.210 port 40626
Feb 16 13:45:55 compute-1 sshd-session[214146]: Connection closed by invalid user eigen 2.57.122.210 port 40626 [preauth]
Feb 16 13:45:56 compute-1 nova_compute[185910]: 2026-02-16 13:45:56.162 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:45:59 compute-1 nova_compute[185910]: 2026-02-16 13:45:59.474 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:01 compute-1 nova_compute[185910]: 2026-02-16 13:46:01.166 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:02 compute-1 podman[214148]: 2026-02-16 13:46:02.933268277 +0000 UTC m=+0.069605095 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 13:46:02 compute-1 podman[214149]: 2026-02-16 13:46:02.945170109 +0000 UTC m=+0.082344640 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:46:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:03.356 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:03.357 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:03.357 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:04 compute-1 sshd-session[214187]: Invalid user test from 146.190.226.24 port 34756
Feb 16 13:46:04 compute-1 sshd-session[214187]: Connection closed by invalid user test 146.190.226.24 port 34756 [preauth]
Feb 16 13:46:04 compute-1 nova_compute[185910]: 2026-02-16 13:46:04.476 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:05 compute-1 podman[195236]: time="2026-02-16T13:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:46:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:46:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 16 13:46:06 compute-1 nova_compute[185910]: 2026-02-16 13:46:06.168 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:09 compute-1 nova_compute[185910]: 2026-02-16 13:46:09.478 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:10 compute-1 podman[214190]: 2026-02-16 13:46:10.926541196 +0000 UTC m=+0.070164129 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:46:11 compute-1 nova_compute[185910]: 2026-02-16 13:46:11.170 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:14 compute-1 nova_compute[185910]: 2026-02-16 13:46:14.480 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:16 compute-1 nova_compute[185910]: 2026-02-16 13:46:16.174 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:18 compute-1 nova_compute[185910]: 2026-02-16 13:46:18.623 185914 DEBUG nova.compute.manager [None req-0eaba42d-ec34-4c69-ae62-364cecb3c836 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 63898862-3dd6-49b3-9545-63882243296a in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 13:46:18 compute-1 nova_compute[185910]: 2026-02-16 13:46:18.705 185914 DEBUG nova.compute.provider_tree [None req-0eaba42d-ec34-4c69-ae62-364cecb3c836 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Updating resource provider 63898862-3dd6-49b3-9545-63882243296a generation from 31 to 34 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 13:46:19 compute-1 nova_compute[185910]: 2026-02-16 13:46:19.615 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:19 compute-1 openstack_network_exporter[198096]: ERROR   13:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:46:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:46:19 compute-1 openstack_network_exporter[198096]: ERROR   13:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:46:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:46:19 compute-1 podman[214217]: 2026-02-16 13:46:19.682638656 +0000 UTC m=+0.057080806 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:46:21 compute-1 nova_compute[185910]: 2026-02-16 13:46:21.178 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:21 compute-1 nova_compute[185910]: 2026-02-16 13:46:21.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:21 compute-1 nova_compute[185910]: 2026-02-16 13:46:21.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:23 compute-1 nova_compute[185910]: 2026-02-16 13:46:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:23 compute-1 nova_compute[185910]: 2026-02-16 13:46:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:24 compute-1 nova_compute[185910]: 2026-02-16 13:46:24.619 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.683 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.684 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.684 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.684 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.856 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.858 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5814MB free_disk=73.22324752807617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.859 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.859 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.913 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.914 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.929 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.986 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:46:25 compute-1 nova_compute[185910]: 2026-02-16 13:46:25.987 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.002 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.022 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.045 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.065 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.067 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.067 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:26 compute-1 nova_compute[185910]: 2026-02-16 13:46:26.181 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:27 compute-1 nova_compute[185910]: 2026-02-16 13:46:27.840 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:27.840 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:46:27 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:27.841 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:46:28 compute-1 nova_compute[185910]: 2026-02-16 13:46:28.062 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:28 compute-1 sshd-session[214241]: Invalid user postgres from 188.166.42.159 port 58182
Feb 16 13:46:28 compute-1 sshd-session[214241]: Connection closed by invalid user postgres 188.166.42.159 port 58182 [preauth]
Feb 16 13:46:28 compute-1 nova_compute[185910]: 2026-02-16 13:46:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:28 compute-1 nova_compute[185910]: 2026-02-16 13:46:28.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:46:28 compute-1 nova_compute[185910]: 2026-02-16 13:46:28.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:46:28 compute-1 nova_compute[185910]: 2026-02-16 13:46:28.668 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:46:29 compute-1 nova_compute[185910]: 2026-02-16 13:46:29.622 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:31 compute-1 nova_compute[185910]: 2026-02-16 13:46:31.184 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:33 compute-1 podman[214244]: 2026-02-16 13:46:33.937148444 +0000 UTC m=+0.062626636 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 13:46:33 compute-1 podman[214243]: 2026-02-16 13:46:33.942576211 +0000 UTC m=+0.072827492 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 
'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:46:34 compute-1 nova_compute[185910]: 2026-02-16 13:46:34.623 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:34 compute-1 nova_compute[185910]: 2026-02-16 13:46:34.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:46:34 compute-1 nova_compute[185910]: 2026-02-16 13:46:34.630 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:46:35 compute-1 podman[195236]: time="2026-02-16T13:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:46:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:46:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 13:46:35 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:35.844 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:36 compute-1 nova_compute[185910]: 2026-02-16 13:46:36.187 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:39 compute-1 nova_compute[185910]: 2026-02-16 13:46:39.624 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:41 compute-1 nova_compute[185910]: 2026-02-16 13:46:41.190 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:42 compute-1 podman[214282]: 2026-02-16 13:46:42.024147021 +0000 UTC m=+0.163697870 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_managed=true, container_name=ovn_controller)
Feb 16 13:46:44 compute-1 nova_compute[185910]: 2026-02-16 13:46:44.628 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:46 compute-1 nova_compute[185910]: 2026-02-16 13:46:46.194 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.536 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.536 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.562 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.819 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.819 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.828 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:46:47 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.829 185914 INFO nova.compute.claims [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:47.999 185914 DEBUG nova.compute.provider_tree [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.014 185914 DEBUG nova.scheduler.client.report [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.043 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.044 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.107 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.108 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.130 185914 INFO nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.156 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.347 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.349 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.350 185914 INFO nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating image(s)
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.351 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.352 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.353 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.382 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.436 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.437 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.438 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.448 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.499 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.500 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.543 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.544 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.544 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.621 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.623 185914 DEBUG nova.virt.disk.api [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.623 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.684 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.685 185914 DEBUG nova.virt.disk.api [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.686 185914 DEBUG nova.objects.instance [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.707 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.707 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Ensure instance console log exists: /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.708 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.708 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:48 compute-1 nova_compute[185910]: 2026-02-16 13:46:48.708 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:49 compute-1 nova_compute[185910]: 2026-02-16 13:46:49.028 185914 DEBUG nova.policy [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:46:49 compute-1 openstack_network_exporter[198096]: ERROR   13:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:46:49 compute-1 openstack_network_exporter[198096]: ERROR   13:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:46:49 compute-1 nova_compute[185910]: 2026-02-16 13:46:49.629 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:49 compute-1 podman[214324]: 2026-02-16 13:46:49.926646342 +0000 UTC m=+0.060549190 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:46:50 compute-1 nova_compute[185910]: 2026-02-16 13:46:50.847 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Successfully created port: 0ec2c49b-401e-4ba2-8344-3d943b18845b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:46:51 compute-1 nova_compute[185910]: 2026-02-16 13:46:51.198 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.034 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Successfully updated port: 0ec2c49b-401e-4ba2-8344-3d943b18845b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.058 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.058 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.059 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.183 185914 DEBUG nova.compute.manager [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-changed-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.184 185914 DEBUG nova.compute.manager [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Refreshing instance network info cache due to event network-changed-0ec2c49b-401e-4ba2-8344-3d943b18845b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.184 185914 DEBUG oslo_concurrency.lockutils [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:46:52 compute-1 nova_compute[185910]: 2026-02-16 13:46:52.241 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.148 185914 DEBUG nova.network.neutron [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.169 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.170 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance network_info: |[{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.171 185914 DEBUG oslo_concurrency.lockutils [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.172 185914 DEBUG nova.network.neutron [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Refreshing network info cache for port 0ec2c49b-401e-4ba2-8344-3d943b18845b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.176 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Start _get_guest_xml network_info=[{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.183 185914 WARNING nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.188 185914 DEBUG nova.virt.libvirt.host [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.189 185914 DEBUG nova.virt.libvirt.host [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.192 185914 DEBUG nova.virt.libvirt.host [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.193 185914 DEBUG nova.virt.libvirt.host [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.195 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.196 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.196 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.197 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.197 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.198 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.198 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.199 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.199 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.199 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.200 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.200 185914 DEBUG nova.virt.hardware [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.207 185914 DEBUG nova.virt.libvirt.vif [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:46:48Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.208 185914 DEBUG nova.network.os_vif_util [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.209 185914 DEBUG nova.network.os_vif_util [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.212 185914 DEBUG nova.objects.instance [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.233 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <uuid>3217baa5-9eb7-414f-b18a-c49217ace9b6</uuid>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <name>instance-00000016</name>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-1517311993</nova:name>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:46:54</nova:creationTime>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         <nova:port uuid="0ec2c49b-401e-4ba2-8344-3d943b18845b">
Feb 16 13:46:54 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <system>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="serial">3217baa5-9eb7-414f-b18a-c49217ace9b6</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="uuid">3217baa5-9eb7-414f-b18a-c49217ace9b6</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </system>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <os>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </os>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <features>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </features>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:34:09:3f"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <target dev="tap0ec2c49b-40"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/console.log" append="off"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <video>
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </video>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:46:54 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:46:54 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:46:54 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:46:54 compute-1 nova_compute[185910]: </domain>
Feb 16 13:46:54 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.235 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Preparing to wait for external event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.236 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.236 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.237 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.238 185914 DEBUG nova.virt.libvirt.vif [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:46:48Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.239 185914 DEBUG nova.network.os_vif_util [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.240 185914 DEBUG nova.network.os_vif_util [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.240 185914 DEBUG os_vif [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.242 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.243 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.243 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.248 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.248 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ec2c49b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.249 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ec2c49b-40, col_values=(('external_ids', {'iface-id': '0ec2c49b-401e-4ba2-8344-3d943b18845b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:09:3f', 'vm-uuid': '3217baa5-9eb7-414f-b18a-c49217ace9b6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.252 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:54 compute-1 NetworkManager[56388]: <info>  [1771249614.2549] manager: (tap0ec2c49b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.255 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.261 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.263 185914 INFO os_vif [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40')
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.337 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.338 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.338 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:34:09:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.339 185914 INFO nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Using config drive
Feb 16 13:46:54 compute-1 nova_compute[185910]: 2026-02-16 13:46:54.632 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.077 185914 INFO nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Creating config drive at /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.081 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpufi51d60 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.208 185914 DEBUG oslo_concurrency.processutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpufi51d60" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:46:55 compute-1 kernel: tap0ec2c49b-40: entered promiscuous mode
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.2716] manager: (tap0ec2c49b-40): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Feb 16 13:46:55 compute-1 ovn_controller[96285]: 2026-02-16T13:46:55Z|00156|binding|INFO|Claiming lport 0ec2c49b-401e-4ba2-8344-3d943b18845b for this chassis.
Feb 16 13:46:55 compute-1 ovn_controller[96285]: 2026-02-16T13:46:55Z|00157|binding|INFO|0ec2c49b-401e-4ba2-8344-3d943b18845b: Claiming fa:16:3e:34:09:3f 10.100.0.4
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.272 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:55 compute-1 ovn_controller[96285]: 2026-02-16T13:46:55Z|00158|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b ovn-installed in OVS
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.278 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:55 compute-1 ovn_controller[96285]: 2026-02-16T13:46:55Z|00159|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b up in Southbound
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.286 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:09:3f 10.100.0.4'], port_security=['fa:16:3e:34:09:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3217baa5-9eb7-414f-b18a-c49217ace9b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=0ec2c49b-401e-4ba2-8344-3d943b18845b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.287 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 0ec2c49b-401e-4ba2-8344-3d943b18845b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.288 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.295 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c35ee59a-b6c7-4255-9ea4-7c102d8016b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.296 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.300 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.300 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[914badcb-048f-43a9-84f0-c88add7684cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.301 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c41a4de2-7d71-4c4e-b5b9-a0132905e4f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 systemd-machined[155419]: New machine qemu-14-instance-00000016.
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.312 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca7c676-7ba9-42a8-af38-b955efc1afa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 systemd[1]: Started Virtual Machine qemu-14-instance-00000016.
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.324 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[37d4bf51-f58d-4844-b6a8-0994ca7d4bb9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 systemd-udevd[214372]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.3449] device (tap0ec2c49b-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.3459] device (tap0ec2c49b-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.352 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[0367cc0d-0f6c-4115-a681-45f07262f7a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.359 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[df9d4619-32ad-4621-b341-6e90663981c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.3608] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.385 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[e2df35ab-b6e9-4132-95b5-71a2b97db045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.389 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5a8e8f13-1d06-4fe9-85ab-30e31cc3bb29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.4119] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.416 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2e321b-5123-45be-a009-c090b5206b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.433 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8187e5-62ee-45bd-b526-e3998ca05500]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569983, 'reachable_time': 32956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214402, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.444 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7e3aa84a-b218-4b38-9f8c-35268e63508a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 569983, 'tstamp': 569983}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214403, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.461 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb7cf2d-17e2-4f5a-a010-6f330a8bd7d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569983, 'reachable_time': 32956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214404, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.490 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce1c972-4748-4e2c-b9a8-0022c2d7e875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.540 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[70af4891-e4b1-45ab-b8de-926900513722]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.542 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.542 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.543 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:55 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:46:55 compute-1 NetworkManager[56388]: <info>  [1771249615.5458] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.547 185914 DEBUG nova.compute.manager [req-e7af02d6-731b-4b8c-9c15-7a8253602fbb req-f6dfb331-32fd-464e-ac20-1f97198e9d1f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.547 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.547 185914 DEBUG oslo_concurrency.lockutils [req-e7af02d6-731b-4b8c-9c15-7a8253602fbb req-f6dfb331-32fd-464e-ac20-1f97198e9d1f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.548 185914 DEBUG oslo_concurrency.lockutils [req-e7af02d6-731b-4b8c-9c15-7a8253602fbb req-f6dfb331-32fd-464e-ac20-1f97198e9d1f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.548 185914 DEBUG oslo_concurrency.lockutils [req-e7af02d6-731b-4b8c-9c15-7a8253602fbb req-f6dfb331-32fd-464e-ac20-1f97198e9d1f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.548 185914 DEBUG nova.compute.manager [req-e7af02d6-731b-4b8c-9c15-7a8253602fbb req-f6dfb331-32fd-464e-ac20-1f97198e9d1f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Processing event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.549 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:55 compute-1 ovn_controller[96285]: 2026-02-16T13:46:55Z|00160|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.549 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.553 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.554 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2f5c17-39cf-44e2-9d72-486a9909b982]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.555 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:46:55 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:46:55.555 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.768 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249615.7677577, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.768 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Started (Lifecycle Event)
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.771 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.780 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.783 185914 INFO nova.virt.libvirt.driver [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance spawned successfully.
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.784 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.796 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.799 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.818 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.818 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.819 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.819 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.819 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.819 185914 DEBUG nova.virt.libvirt.driver [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.823 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.823 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249615.7679043, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.824 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Paused (Lifecycle Event)
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.866 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.870 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249615.7791276, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.870 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Resumed (Lifecycle Event)
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.906 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.910 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:46:55 compute-1 podman[214443]: 2026-02-16 13:46:55.927238414 +0000 UTC m=+0.043716864 container create ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.929 185914 INFO nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Took 7.58 seconds to spawn the instance on the hypervisor.
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.929 185914 DEBUG nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:46:55 compute-1 nova_compute[185910]: 2026-02-16 13:46:55.936 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:46:55 compute-1 systemd[1]: Started libpod-conmon-ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03.scope.
Feb 16 13:46:55 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:46:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bb8847daf17ccdc74354b43e4073fdf6c09fb2773c00a89e259f24b99ae2e79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:46:55 compute-1 podman[214443]: 2026-02-16 13:46:55.998494492 +0000 UTC m=+0.114972952 container init ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:46:56 compute-1 podman[214443]: 2026-02-16 13:46:55.903646936 +0000 UTC m=+0.020125386 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:46:56 compute-1 podman[214443]: 2026-02-16 13:46:56.004250078 +0000 UTC m=+0.120728508 container start ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:46:56 compute-1 nova_compute[185910]: 2026-02-16 13:46:56.008 185914 INFO nova.compute.manager [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Took 8.23 seconds to build instance.
Feb 16 13:46:56 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [NOTICE]   (214462) : New worker (214464) forked
Feb 16 13:46:56 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [NOTICE]   (214462) : Loading success.
Feb 16 13:46:56 compute-1 nova_compute[185910]: 2026-02-16 13:46:56.029 185914 DEBUG oslo_concurrency.lockutils [None req-a2ec6a8a-b18f-4205-b581-d109cc32b7ab e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:56 compute-1 nova_compute[185910]: 2026-02-16 13:46:56.075 185914 DEBUG nova.network.neutron [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updated VIF entry in instance network info cache for port 0ec2c49b-401e-4ba2-8344-3d943b18845b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:46:56 compute-1 nova_compute[185910]: 2026-02-16 13:46:56.075 185914 DEBUG nova.network.neutron [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:46:56 compute-1 nova_compute[185910]: 2026-02-16 13:46:56.094 185914 DEBUG oslo_concurrency.lockutils [req-cb2e40c5-1ca9-486c-9729-e534bb339866 req-4c371d37-723d-4b53-a0f1-e63a4b097012 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.019 185914 DEBUG nova.compute.manager [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.019 185914 DEBUG oslo_concurrency.lockutils [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.020 185914 DEBUG oslo_concurrency.lockutils [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.020 185914 DEBUG oslo_concurrency.lockutils [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.021 185914 DEBUG nova.compute.manager [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:46:58 compute-1 nova_compute[185910]: 2026-02-16 13:46:58.021 185914 WARNING nova.compute.manager [req-e1923b3a-e114-4f94-ae14-2f7a79aa348f req-d3d8979d-0e79-4083-8768-fa64373e8be5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state active and task_state None.
Feb 16 13:46:59 compute-1 nova_compute[185910]: 2026-02-16 13:46:59.254 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:46:59 compute-1 nova_compute[185910]: 2026-02-16 13:46:59.633 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:03.357 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:03.358 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:03.358 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:04 compute-1 nova_compute[185910]: 2026-02-16 13:47:04.258 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:04 compute-1 nova_compute[185910]: 2026-02-16 13:47:04.635 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:04 compute-1 podman[214474]: 2026-02-16 13:47:04.931237625 +0000 UTC m=+0.060810647 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:47:04 compute-1 podman[214473]: 2026-02-16 13:47:04.938313327 +0000 UTC m=+0.068434513 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 16 13:47:05 compute-1 nova_compute[185910]: 2026-02-16 13:47:05.621 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Check if temp file /var/lib/nova/instances/tmpu_jf1msv exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:47:05 compute-1 nova_compute[185910]: 2026-02-16 13:47:05.623 185914 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:47:05 compute-1 podman[195236]: time="2026-02-16T13:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:47:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:47:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:47:07 compute-1 nova_compute[185910]: 2026-02-16 13:47:07.997 185914 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:08 compute-1 nova_compute[185910]: 2026-02-16 13:47:08.051 185914 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:08 compute-1 nova_compute[185910]: 2026-02-16 13:47:08.053 185914 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:47:08 compute-1 nova_compute[185910]: 2026-02-16 13:47:08.130 185914 DEBUG oslo_concurrency.processutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:47:08 compute-1 ovn_controller[96285]: 2026-02-16T13:47:08Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:09:3f 10.100.0.4
Feb 16 13:47:08 compute-1 ovn_controller[96285]: 2026-02-16T13:47:08Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:09:3f 10.100.0.4
Feb 16 13:47:09 compute-1 nova_compute[185910]: 2026-02-16 13:47:09.263 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:09 compute-1 nova_compute[185910]: 2026-02-16 13:47:09.638 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:12 compute-1 podman[214534]: 2026-02-16 13:47:12.961601558 +0000 UTC m=+0.104240742 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:47:13 compute-1 sshd-session[214533]: Invalid user test from 146.190.226.24 port 47370
Feb 16 13:47:13 compute-1 sshd-session[214533]: Connection closed by invalid user test 146.190.226.24 port 47370 [preauth]
Feb 16 13:47:13 compute-1 sshd-session[214562]: Accepted publickey for nova from 192.168.122.100 port 37364 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:47:13 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:47:13 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:47:13 compute-1 systemd-logind[821]: New session 44 of user nova.
Feb 16 13:47:13 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:47:13 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:47:13 compute-1 systemd[214566]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:47:14 compute-1 systemd[214566]: Queued start job for default target Main User Target.
Feb 16 13:47:14 compute-1 systemd[214566]: Created slice User Application Slice.
Feb 16 13:47:14 compute-1 systemd[214566]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:47:14 compute-1 systemd[214566]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:47:14 compute-1 systemd[214566]: Reached target Paths.
Feb 16 13:47:14 compute-1 systemd[214566]: Reached target Timers.
Feb 16 13:47:14 compute-1 systemd[214566]: Starting D-Bus User Message Bus Socket...
Feb 16 13:47:14 compute-1 systemd[214566]: Starting Create User's Volatile Files and Directories...
Feb 16 13:47:14 compute-1 systemd[214566]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:47:14 compute-1 systemd[214566]: Reached target Sockets.
Feb 16 13:47:14 compute-1 systemd[214566]: Finished Create User's Volatile Files and Directories.
Feb 16 13:47:14 compute-1 systemd[214566]: Reached target Basic System.
Feb 16 13:47:14 compute-1 systemd[214566]: Reached target Main User Target.
Feb 16 13:47:14 compute-1 systemd[214566]: Startup finished in 101ms.
Feb 16 13:47:14 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:47:14 compute-1 systemd[1]: Started Session 44 of User nova.
Feb 16 13:47:14 compute-1 sshd-session[214562]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:47:14 compute-1 sshd-session[214581]: Received disconnect from 192.168.122.100 port 37364:11: disconnected by user
Feb 16 13:47:14 compute-1 sshd-session[214581]: Disconnected from user nova 192.168.122.100 port 37364
Feb 16 13:47:14 compute-1 sshd-session[214562]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:47:14 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Feb 16 13:47:14 compute-1 systemd-logind[821]: Session 44 logged out. Waiting for processes to exit.
Feb 16 13:47:14 compute-1 systemd-logind[821]: Removed session 44.
Feb 16 13:47:14 compute-1 nova_compute[185910]: 2026-02-16 13:47:14.266 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:14 compute-1 nova_compute[185910]: 2026-02-16 13:47:14.640 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.239 185914 DEBUG nova.compute.manager [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.240 185914 DEBUG oslo_concurrency.lockutils [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.240 185914 DEBUG oslo_concurrency.lockutils [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.240 185914 DEBUG oslo_concurrency.lockutils [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.240 185914 DEBUG nova.compute.manager [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.240 185914 DEBUG nova.compute.manager [req-cf0db826-a0cb-41c8-9956-c426c0c1dc79 req-428a2707-f859-49c1-b488-cfd2c464c0ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.925 185914 INFO nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Took 7.79 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.925 185914 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.943 185914 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu_jf1msv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3217baa5-9eb7-414f-b18a-c49217ace9b6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(96cc26af-369a-42a4-acdf-7e760a2b25f9),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.969 185914 DEBUG nova.objects.instance [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'migration_context' on Instance uuid 3217baa5-9eb7-414f-b18a-c49217ace9b6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.971 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.973 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.973 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.994 185914 DEBUG nova.virt.libvirt.vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:46:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:46:55Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.994 185914 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.995 185914 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.995 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:47:15 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:34:09:3f"/>
Feb 16 13:47:15 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:47:15 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:47:15 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:47:15 compute-1 nova_compute[185910]:   <target dev="tap0ec2c49b-40"/>
Feb 16 13:47:15 compute-1 nova_compute[185910]: </interface>
Feb 16 13:47:15 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:47:15 compute-1 nova_compute[185910]: 2026-02-16 13:47:15.996 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:47:16 compute-1 nova_compute[185910]: 2026-02-16 13:47:16.476 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:47:16 compute-1 nova_compute[185910]: 2026-02-16 13:47:16.477 185914 INFO nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:47:16 compute-1 nova_compute[185910]: 2026-02-16 13:47:16.572 185914 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.075 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.075 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.579 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.580 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.621 185914 DEBUG nova.compute.manager [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.622 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.622 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.623 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.623 185914 DEBUG nova.compute.manager [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.624 185914 WARNING nova.compute.manager [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state active and task_state migrating.
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.624 185914 DEBUG nova.compute.manager [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-changed-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.624 185914 DEBUG nova.compute.manager [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Refreshing instance network info cache due to event network-changed-0ec2c49b-401e-4ba2-8344-3d943b18845b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.624 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.625 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:47:17 compute-1 nova_compute[185910]: 2026-02-16 13:47:17.625 185914 DEBUG nova.network.neutron [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Refreshing network info cache for port 0ec2c49b-401e-4ba2-8344-3d943b18845b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.086 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.087 185914 DEBUG nova.virt.libvirt.migration [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.252 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249638.2511017, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.252 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Paused (Lifecycle Event)
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.299 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.306 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.334 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:47:18 compute-1 kernel: tap0ec2c49b-40 (unregistering): left promiscuous mode
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.393 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 NetworkManager[56388]: <info>  [1771249638.3948] device (tap0ec2c49b-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:47:18 compute-1 ovn_controller[96285]: 2026-02-16T13:47:18Z|00161|binding|INFO|Releasing lport 0ec2c49b-401e-4ba2-8344-3d943b18845b from this chassis (sb_readonly=0)
Feb 16 13:47:18 compute-1 ovn_controller[96285]: 2026-02-16T13:47:18Z|00162|binding|INFO|Setting lport 0ec2c49b-401e-4ba2-8344-3d943b18845b down in Southbound
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.400 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 ovn_controller[96285]: 2026-02-16T13:47:18Z|00163|binding|INFO|Removing iface tap0ec2c49b-40 ovn-installed in OVS
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.402 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.407 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.411 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:09:3f 10.100.0.4'], port_security=['fa:16:3e:34:09:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '3217baa5-9eb7-414f-b18a-c49217ace9b6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=0ec2c49b-401e-4ba2-8344-3d943b18845b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.413 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 0ec2c49b-401e-4ba2-8344-3d943b18845b in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.414 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.415 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3ce4a6-fdd3-4706-9788-4b6cf6b07ce8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.416 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:47:18 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000016.scope: Deactivated successfully.
Feb 16 13:47:18 compute-1 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000016.scope: Consumed 13.203s CPU time.
Feb 16 13:47:18 compute-1 systemd-machined[155419]: Machine qemu-14-instance-00000016 terminated.
Feb 16 13:47:18 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [NOTICE]   (214462) : haproxy version is 2.8.14-c23fe91
Feb 16 13:47:18 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [NOTICE]   (214462) : path to executable is /usr/sbin/haproxy
Feb 16 13:47:18 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [WARNING]  (214462) : Exiting Master process...
Feb 16 13:47:18 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [ALERT]    (214462) : Current worker (214464) exited with code 143 (Terminated)
Feb 16 13:47:18 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[214458]: [WARNING]  (214462) : All workers exited. Exiting... (0)
Feb 16 13:47:18 compute-1 systemd[1]: libpod-ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03.scope: Deactivated successfully.
Feb 16 13:47:18 compute-1 podman[214613]: 2026-02-16 13:47:18.548895807 +0000 UTC m=+0.044499415 container died ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:47:18 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03-userdata-shm.mount: Deactivated successfully.
Feb 16 13:47:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-3bb8847daf17ccdc74354b43e4073fdf6c09fb2773c00a89e259f24b99ae2e79-merged.mount: Deactivated successfully.
Feb 16 13:47:18 compute-1 podman[214613]: 2026-02-16 13:47:18.590959915 +0000 UTC m=+0.086563493 container cleanup ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:47:18 compute-1 systemd[1]: libpod-conmon-ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03.scope: Deactivated successfully.
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.616 185914 DEBUG nova.virt.libvirt.guest [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.618 185914 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migration operation has completed
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.619 185914 INFO nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] _post_live_migration() is started..
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.626 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.627 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.627 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:47:18 compute-1 podman[214651]: 2026-02-16 13:47:18.657824105 +0000 UTC m=+0.044114275 container remove ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.661 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd7d45f-060b-4585-a4aa-456105f7a081]: (4, ('Mon Feb 16 01:47:18 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03)\nac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03\nMon Feb 16 01:47:18 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (ac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03)\nac75a4e0a62df4ef1f09e57d136cd2825b588798f747a2f0aa4e7d3143228d03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.663 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[18268404-f7f4-4ad1-b71a-9052849eac69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.665 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.708 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:47:18 compute-1 nova_compute[185910]: 2026-02-16 13:47:18.715 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.718 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8770c175-a3c8-499a-ab0a-23fc1f52efbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.742 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[80958310-464a-44f8-b6fe-bd35e5bf3403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.744 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9004fa-3e15-449a-8aa3-25339b2d991a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.756 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e83efeb5-588f-41a7-aab9-09cd38e9d550]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 569976, 'reachable_time': 42483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214676, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.759 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:47:18 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:18.759 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[d8dae16c-32ef-4db3-a849-0c423cf590a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:47:18 compute-1 sshd-session[214584]: Invalid user postgres from 188.166.42.159 port 47370
Feb 16 13:47:19 compute-1 sshd-session[214584]: Connection closed by invalid user postgres 188.166.42.159 port 47370 [preauth]
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.270 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.296 185914 DEBUG nova.compute.manager [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.297 185914 DEBUG oslo_concurrency.lockutils [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.298 185914 DEBUG oslo_concurrency.lockutils [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.298 185914 DEBUG oslo_concurrency.lockutils [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.299 185914 DEBUG nova.compute.manager [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.299 185914 DEBUG nova.compute.manager [req-9303afbc-9799-4039-862c-7d9b31266a03 req-7a7342be-9c7d-4c20-b065-b792e7f13749 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-unplugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:47:19 compute-1 openstack_network_exporter[198096]: ERROR   13:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:47:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:47:19 compute-1 openstack_network_exporter[198096]: ERROR   13:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:47:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:47:19 compute-1 nova_compute[185910]: 2026-02-16 13:47:19.642 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.425 185914 DEBUG nova.network.neutron [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Activated binding for port 0ec2c49b-401e-4ba2-8344-3d943b18845b and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.426 185914 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.427 185914 DEBUG nova.virt.libvirt.vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:46:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1517311993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1517311993',id=22,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:46:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-4pvdm8hl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:47:02Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=3217baa5-9eb7-414f-b18a-c49217ace9b6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.427 185914 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.428 185914 DEBUG nova.network.os_vif_util [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.428 185914 DEBUG os_vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.430 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.430 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ec2c49b-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.432 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.435 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.437 185914 INFO os_vif [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:09:3f,bridge_name='br-int',has_traffic_filtering=True,id=0ec2c49b-401e-4ba2-8344-3d943b18845b,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ec2c49b-40')
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.438 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.438 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.438 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.438 185914 DEBUG nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.439 185914 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Deleting instance files /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6_del
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.439 185914 INFO nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Deletion of /var/lib/nova/instances/3217baa5-9eb7-414f-b18a-c49217ace9b6_del complete
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.888 185914 DEBUG nova.network.neutron [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updated VIF entry in instance network info cache for port 0ec2c49b-401e-4ba2-8344-3d943b18845b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.889 185914 DEBUG nova.network.neutron [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Updating instance_info_cache with network_info: [{"id": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "address": "fa:16:3e:34:09:3f", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ec2c49b-40", "ovs_interfaceid": "0ec2c49b-401e-4ba2-8344-3d943b18845b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:47:20 compute-1 nova_compute[185910]: 2026-02-16 13:47:20.921 185914 DEBUG oslo_concurrency.lockutils [req-c0f299ce-fa1b-4880-8f54-b0a5987e4b86 req-45683efb-300c-4ab7-90fd-86fa76630e82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3217baa5-9eb7-414f-b18a-c49217ace9b6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:47:20 compute-1 podman[214678]: 2026-02-16 13:47:20.936980972 +0000 UTC m=+0.080491570 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.460 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.460 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.461 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.461 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.461 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.462 185914 WARNING nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state active and task_state migrating.
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.462 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.462 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.462 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.463 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.463 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.463 185914 WARNING nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state active and task_state migrating.
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.463 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.464 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.464 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.464 185914 DEBUG oslo_concurrency.lockutils [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.464 185914 DEBUG nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] No waiting events found dispatching network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:47:21 compute-1 nova_compute[185910]: 2026-02-16 13:47:21.464 185914 WARNING nova.compute.manager [req-db4f7337-e2a2-4791-bbb0-f9e411289e23 req-55807743-cb09-474d-99c1-fcc9dad73383 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Received unexpected event network-vif-plugged-0ec2c49b-401e-4ba2-8344-3d943b18845b for instance with vm_state active and task_state migrating.
Feb 16 13:47:23 compute-1 nova_compute[185910]: 2026-02-16 13:47:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:23 compute-1 nova_compute[185910]: 2026-02-16 13:47:23.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:23 compute-1 sshd-session[214677]: error: kex_exchange_identification: read: Connection reset by peer
Feb 16 13:47:23 compute-1 sshd-session[214677]: Connection reset by 103.236.154.142 port 59460
Feb 16 13:47:24 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:47:24 compute-1 systemd[214566]: Activating special unit Exit the Session...
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped target Main User Target.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped target Basic System.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped target Paths.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped target Sockets.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped target Timers.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:47:24 compute-1 systemd[214566]: Closed D-Bus User Message Bus Socket.
Feb 16 13:47:24 compute-1 systemd[214566]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:47:24 compute-1 systemd[214566]: Removed slice User Application Slice.
Feb 16 13:47:24 compute-1 systemd[214566]: Reached target Shutdown.
Feb 16 13:47:24 compute-1 systemd[214566]: Finished Exit the Session.
Feb 16 13:47:24 compute-1 systemd[214566]: Reached target Exit the Session.
Feb 16 13:47:24 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:47:24 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:47:24 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:47:24 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:47:24 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:47:24 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:47:24 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:47:24 compute-1 nova_compute[185910]: 2026-02-16 13:47:24.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:24 compute-1 nova_compute[185910]: 2026-02-16 13:47:24.645 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:25 compute-1 ovn_controller[96285]: 2026-02-16T13:47:25Z|00164|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Feb 16 13:47:25 compute-1 nova_compute[185910]: 2026-02-16 13:47:25.433 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:25 compute-1 nova_compute[185910]: 2026-02-16 13:47:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.526 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.527 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.527 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "3217baa5-9eb7-414f-b18a-c49217ace9b6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.560 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.560 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.561 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.561 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.682 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.683 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.683 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.683 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.795 185914 WARNING nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.796 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.22323608398438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.797 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.797 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.847 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Migration for instance 3217baa5-9eb7-414f-b18a-c49217ace9b6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.877 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.878 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.22323608398438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.879 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.883 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.918 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Migration 96cc26af-369a-42a4-acdf-7e760a2b25f9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.919 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.919 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.970 185914 DEBUG nova.compute.provider_tree [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:47:27 compute-1 nova_compute[185910]: 2026-02-16 13:47:27.990 185914 DEBUG nova.scheduler.client.report [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.020 185914 DEBUG nova.compute.resource_tracker [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.021 185914 DEBUG oslo_concurrency.lockutils [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.024 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.027 185914 INFO nova.compute.manager [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.127 185914 INFO nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 96cc26af-369a-42a4-acdf-7e760a2b25f9 has allocations against this compute host but is not found in the database.
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.127 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.128 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.170 185914 INFO nova.scheduler.client.report [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Deleted allocation for migration 96cc26af-369a-42a4-acdf-7e760a2b25f9
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.171 185914 DEBUG nova.virt.libvirt.driver [None req-1a7029b2-abe9-4613-bea5-b54bc2c81ba8 d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.173 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.189 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.192 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:47:28 compute-1 nova_compute[185910]: 2026-02-16 13:47:28.192 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.189 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.190 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.190 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.191 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.216 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:47:29 compute-1 nova_compute[185910]: 2026-02-16 13:47:29.646 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:30 compute-1 nova_compute[185910]: 2026-02-16 13:47:30.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:33 compute-1 nova_compute[185910]: 2026-02-16 13:47:33.618 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249638.6158805, 3217baa5-9eb7-414f-b18a-c49217ace9b6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:47:33 compute-1 nova_compute[185910]: 2026-02-16 13:47:33.618 185914 INFO nova.compute.manager [-] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] VM Stopped (Lifecycle Event)
Feb 16 13:47:33 compute-1 nova_compute[185910]: 2026-02-16 13:47:33.661 185914 DEBUG nova.compute.manager [None req-1e0bc295-6bf5-4f84-9cec-7ca596046667 - - - - - -] [instance: 3217baa5-9eb7-414f-b18a-c49217ace9b6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:47:34 compute-1 nova_compute[185910]: 2026-02-16 13:47:34.648 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:35 compute-1 nova_compute[185910]: 2026-02-16 13:47:35.439 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:35 compute-1 podman[195236]: time="2026-02-16T13:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:47:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:47:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 16 13:47:35 compute-1 podman[214707]: 2026-02-16 13:47:35.924979968 +0000 UTC m=+0.055626597 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=9.7, io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 13:47:35 compute-1 podman[214708]: 2026-02-16 13:47:35.944985529 +0000 UTC m=+0.072507103 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 13:47:36 compute-1 nova_compute[185910]: 2026-02-16 13:47:36.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:36 compute-1 nova_compute[185910]: 2026-02-16 13:47:36.870 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:47:36 compute-1 nova_compute[185910]: 2026-02-16 13:47:36.870 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:47:37 compute-1 sshd-session[214704]: Invalid user a from 103.236.154.142 port 42872
Feb 16 13:47:37 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:37.775 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:47:37 compute-1 nova_compute[185910]: 2026-02-16 13:47:37.776 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:37 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:37.776 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:47:39 compute-1 nova_compute[185910]: 2026-02-16 13:47:39.650 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:40 compute-1 sshd-session[214704]: Connection closed by invalid user a 103.236.154.142 port 42872 [preauth]
Feb 16 13:47:40 compute-1 nova_compute[185910]: 2026-02-16 13:47:40.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:40 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:47:40.779 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:47:43 compute-1 podman[214748]: 2026-02-16 13:47:43.995287631 +0000 UTC m=+0.120288016 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:47:44 compute-1 nova_compute[185910]: 2026-02-16 13:47:44.653 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:45 compute-1 nova_compute[185910]: 2026-02-16 13:47:45.444 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:49 compute-1 openstack_network_exporter[198096]: ERROR   13:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:47:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:47:49 compute-1 openstack_network_exporter[198096]: ERROR   13:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:47:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:47:49 compute-1 nova_compute[185910]: 2026-02-16 13:47:49.656 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:50 compute-1 nova_compute[185910]: 2026-02-16 13:47:50.448 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:51 compute-1 podman[214774]: 2026-02-16 13:47:51.940292514 +0000 UTC m=+0.076808689 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:47:54 compute-1 nova_compute[185910]: 2026-02-16 13:47:54.658 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:55 compute-1 nova_compute[185910]: 2026-02-16 13:47:55.504 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:47:59 compute-1 nova_compute[185910]: 2026-02-16 13:47:59.658 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:00 compute-1 nova_compute[185910]: 2026-02-16 13:48:00.507 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:03.359 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:03.360 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:03.360 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:04 compute-1 nova_compute[185910]: 2026-02-16 13:48:04.660 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:05 compute-1 nova_compute[185910]: 2026-02-16 13:48:05.550 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:05 compute-1 podman[195236]: time="2026-02-16T13:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:48:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:48:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:48:06 compute-1 podman[214800]: 2026-02-16 13:48:06.906793878 +0000 UTC m=+0.045443501 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 16 13:48:06 compute-1 podman[214799]: 2026-02-16 13:48:06.912943674 +0000 UTC m=+0.054292960 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:48:09 compute-1 nova_compute[185910]: 2026-02-16 13:48:09.663 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:10 compute-1 nova_compute[185910]: 2026-02-16 13:48:10.594 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:12 compute-1 sshd-session[214840]: Invalid user postgres from 188.166.42.159 port 33054
Feb 16 13:48:12 compute-1 sshd-session[214840]: Connection closed by invalid user postgres 188.166.42.159 port 33054 [preauth]
Feb 16 13:48:12 compute-1 ovn_controller[96285]: 2026-02-16T13:48:12Z|00165|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 16 13:48:14 compute-1 nova_compute[185910]: 2026-02-16 13:48:14.666 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:15 compute-1 podman[214842]: 2026-02-16 13:48:15.009002266 +0000 UTC m=+0.144367438 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:48:15 compute-1 nova_compute[185910]: 2026-02-16 13:48:15.597 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:19 compute-1 sshd-session[214868]: Connection closed by authenticating user root 2.57.122.210 port 43334 [preauth]
Feb 16 13:48:19 compute-1 openstack_network_exporter[198096]: ERROR   13:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:48:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:48:19 compute-1 openstack_network_exporter[198096]: ERROR   13:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:48:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:48:19 compute-1 nova_compute[185910]: 2026-02-16 13:48:19.669 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:20 compute-1 nova_compute[185910]: 2026-02-16 13:48:20.600 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:21 compute-1 sshd-session[214870]: Invalid user test from 146.190.226.24 port 45628
Feb 16 13:48:22 compute-1 sshd-session[214870]: Connection closed by invalid user test 146.190.226.24 port 45628 [preauth]
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.288 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.289 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.307 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.400 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.401 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.412 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.413 185914 INFO nova.compute.claims [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.652 185914 DEBUG nova.compute.provider_tree [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.673 185914 DEBUG nova.scheduler.client.report [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.725 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.726 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.787 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.788 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.810 185914 INFO nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.840 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:48:22 compute-1 podman[214872]: 2026-02-16 13:48:22.914960461 +0000 UTC m=+0.060006125 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.940 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.941 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.942 185914 INFO nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating image(s)
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.942 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.942 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.943 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:22 compute-1 nova_compute[185910]: 2026-02-16 13:48:22.956 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.001 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.002 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.003 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.019 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.096 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.097 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.134 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.135 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.135 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.196 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.198 185914 DEBUG nova.virt.disk.api [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Checking if we can resize image /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.199 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.258 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.259 185914 DEBUG nova.virt.disk.api [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Cannot resize image /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.259 185914 DEBUG nova.objects.instance [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'migration_context' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.276 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.277 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Ensure instance console log exists: /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.278 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.278 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.278 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:23 compute-1 nova_compute[185910]: 2026-02-16 13:48:23.959 185914 DEBUG nova.policy [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e19cd2d8a8894526ba620ca3249e9a63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:48:24 compute-1 nova_compute[185910]: 2026-02-16 13:48:24.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:24 compute-1 nova_compute[185910]: 2026-02-16 13:48:24.670 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:25 compute-1 nova_compute[185910]: 2026-02-16 13:48:25.268 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Successfully created port: 4005b3ce-3d4d-4741-91d2-940ee880a617 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:48:25 compute-1 nova_compute[185910]: 2026-02-16 13:48:25.603 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:25 compute-1 nova_compute[185910]: 2026-02-16 13:48:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:26 compute-1 nova_compute[185910]: 2026-02-16 13:48:26.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.116 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Successfully updated port: 4005b3ce-3d4d-4741-91d2-940ee880a617 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.146 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.146 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.146 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.218 185914 DEBUG nova.compute.manager [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-changed-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.218 185914 DEBUG nova.compute.manager [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Refreshing instance network info cache due to event network-changed-4005b3ce-3d4d-4741-91d2-940ee880a617. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.218 185914 DEBUG oslo_concurrency.lockutils [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.294 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:48:27 compute-1 nova_compute[185910]: 2026-02-16 13:48:27.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.071 185914 DEBUG nova.network.neutron [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.105 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.105 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance network_info: |[{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.106 185914 DEBUG oslo_concurrency.lockutils [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.106 185914 DEBUG nova.network.neutron [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Refreshing network info cache for port 4005b3ce-3d4d-4741-91d2-940ee880a617 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.109 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Start _get_guest_xml network_info=[{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.114 185914 WARNING nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.121 185914 DEBUG nova.virt.libvirt.host [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.122 185914 DEBUG nova.virt.libvirt.host [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.131 185914 DEBUG nova.virt.libvirt.host [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.132 185914 DEBUG nova.virt.libvirt.host [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.133 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.133 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.134 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.134 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.135 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.135 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.135 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.135 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.136 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.136 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.136 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.136 185914 DEBUG nova.virt.hardware [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.141 185914 DEBUG nova.virt.libvirt.vif [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:48:22Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.142 185914 DEBUG nova.network.os_vif_util [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.143 185914 DEBUG nova.network.os_vif_util [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.145 185914 DEBUG nova.objects.instance [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.161 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <uuid>4433d998-a1da-44d3-ae35-b75895398b1f</uuid>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <name>instance-00000017</name>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteStrategies-server-313355006</nova:name>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:48:28</nova:creationTime>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:user uuid="e19cd2d8a8894526ba620ca3249e9a63">tempest-TestExecuteStrategies-1085993185-project-member</nova:user>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:project uuid="76c271745e704d5fa97fe16a7dcd4a81">tempest-TestExecuteStrategies-1085993185</nova:project>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         <nova:port uuid="4005b3ce-3d4d-4741-91d2-940ee880a617">
Feb 16 13:48:28 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <system>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="serial">4433d998-a1da-44d3-ae35-b75895398b1f</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="uuid">4433d998-a1da-44d3-ae35-b75895398b1f</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </system>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <os>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </os>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <features>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </features>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:1c:67:1b"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <target dev="tap4005b3ce-3d"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/console.log" append="off"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <video>
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </video>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:48:28 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:48:28 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:48:28 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:48:28 compute-1 nova_compute[185910]: </domain>
Feb 16 13:48:28 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.162 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Preparing to wait for external event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.162 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.163 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.163 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.164 185914 DEBUG nova.virt.libvirt.vif [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:48:22Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.164 185914 DEBUG nova.network.os_vif_util [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.165 185914 DEBUG nova.network.os_vif_util [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.166 185914 DEBUG os_vif [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.167 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.167 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.168 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.171 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.172 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4005b3ce-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.172 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4005b3ce-3d, col_values=(('external_ids', {'iface-id': '4005b3ce-3d4d-4741-91d2-940ee880a617', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:67:1b', 'vm-uuid': '4433d998-a1da-44d3-ae35-b75895398b1f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.175 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:28 compute-1 NetworkManager[56388]: <info>  [1771249708.1760] manager: (tap4005b3ce-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.177 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.184 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.185 185914 INFO os_vif [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d')
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.249 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.250 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.250 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] No VIF found with MAC fa:16:3e:1c:67:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.251 185914 INFO nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Using config drive
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.665 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.723 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.797 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.798 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.863 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.865 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000017, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config'
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.992 185914 INFO nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Creating config drive at /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config
Feb 16 13:48:28 compute-1 nova_compute[185910]: 2026-02-16 13:48:28.997 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpe857uny0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.012 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.015 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5809MB free_disk=73.22295761108398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.015 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.015 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.096 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 4433d998-a1da-44d3-ae35-b75895398b1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.096 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.096 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.116 185914 DEBUG oslo_concurrency.processutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpe857uny0" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:48:29 compute-1 kernel: tap4005b3ce-3d: entered promiscuous mode
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.1718] manager: (tap4005b3ce-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Feb 16 13:48:29 compute-1 ovn_controller[96285]: 2026-02-16T13:48:29Z|00166|binding|INFO|Claiming lport 4005b3ce-3d4d-4741-91d2-940ee880a617 for this chassis.
Feb 16 13:48:29 compute-1 ovn_controller[96285]: 2026-02-16T13:48:29Z|00167|binding|INFO|4005b3ce-3d4d-4741-91d2-940ee880a617: Claiming fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.172 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 ovn_controller[96285]: 2026-02-16T13:48:29Z|00168|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 ovn-installed in OVS
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.178 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:48:29 compute-1 ovn_controller[96285]: 2026-02-16T13:48:29Z|00169|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 up in Southbound
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.180 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '2', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.181 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 bound to our chassis
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.180 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.182 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.184 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.196 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5cb211d3-133d-44d7-b305-959f869a5e2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.198 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62a1ccdd-31 in ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.198 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.202 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62a1ccdd-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.202 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[199a144a-39b1-4396-a546-74d770f7d282]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.205 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8b56fb-a92d-477f-877d-58a7809d90fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 systemd-machined[155419]: New machine qemu-15-instance-00000017.
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.214 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0a98b2-bf82-408f-9013-6effe46e83d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.221 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.221 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:29 compute-1 systemd[1]: Started Virtual Machine qemu-15-instance-00000017.
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.238 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1602aa4b-1e76-4777-acd1-ce8b66d31734]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 systemd-udevd[214939]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.2504] device (tap4005b3ce-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.2513] device (tap4005b3ce-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.266 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[34efdac9-7d2f-4f62-ab8a-63dfb8d54512]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.272 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[81dd864e-1a5e-46c8-a4c2-6e70b328aa00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.2751] manager: (tap62a1ccdd-30): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.296 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[2c091893-e6f1-4ab7-9a8f-31be70f289a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.299 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[048c062a-5580-410d-8e03-bbcde88a90b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.3187] device (tap62a1ccdd-30): carrier: link connected
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.325 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[eee3f951-6542-45b7-bd88-5fe8adbcde6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.343 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0896a3c6-2762-43e7-9a4f-5131e221aa8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579373, 'reachable_time': 43649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214969, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.354 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8fc643-d04a-4164-b991-a42b724c6b5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea9:9492'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 579373, 'tstamp': 579373}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214970, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.365 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1b9bf4-fb88-412e-bc83-ccc774e21d79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62a1ccdd-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a9:94:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579373, 'reachable_time': 43649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214971, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.384 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c3399c7b-5979-4569-ba74-76df1af1759e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.415 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[186ecd1a-b722-4848-85b0-2b06b8eddaa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.417 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.418 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.418 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62a1ccdd-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:29 compute-1 NetworkManager[56388]: <info>  [1771249709.4211] manager: (tap62a1ccdd-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Feb 16 13:48:29 compute-1 kernel: tap62a1ccdd-30: entered promiscuous mode
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.421 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.423 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62a1ccdd-30, col_values=(('external_ids', {'iface-id': 'ac21d57d-f71e-4560-b6aa-e9f6e3838308'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.424 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 ovn_controller[96285]: 2026-02-16T13:48:29Z|00170|binding|INFO|Releasing lport ac21d57d-f71e-4560-b6aa-e9f6e3838308 from this chassis (sb_readonly=0)
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.425 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.426 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d93b0aab-1f2d-43ec-b85b-e8963e646aaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.427 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.pid.haproxy
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 62a1ccdd-3048-4bbf-acc8-c791bff79ee8
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:48:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:29.428 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'env', 'PROCESS_TAG=haproxy-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62a1ccdd-3048-4bbf-acc8-c791bff79ee8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.429 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.458 185914 DEBUG nova.compute.manager [req-a981a4d7-5d6b-4d67-ae73-dadfbfe183b7 req-0639dcc9-202e-43ff-8302-966229e98ca4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.458 185914 DEBUG oslo_concurrency.lockutils [req-a981a4d7-5d6b-4d67-ae73-dadfbfe183b7 req-0639dcc9-202e-43ff-8302-966229e98ca4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.459 185914 DEBUG oslo_concurrency.lockutils [req-a981a4d7-5d6b-4d67-ae73-dadfbfe183b7 req-0639dcc9-202e-43ff-8302-966229e98ca4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.459 185914 DEBUG oslo_concurrency.lockutils [req-a981a4d7-5d6b-4d67-ae73-dadfbfe183b7 req-0639dcc9-202e-43ff-8302-966229e98ca4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.459 185914 DEBUG nova.compute.manager [req-a981a4d7-5d6b-4d67-ae73-dadfbfe183b7 req-0639dcc9-202e-43ff-8302-966229e98ca4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Processing event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.502 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249709.5018911, 4433d998-a1da-44d3-ae35-b75895398b1f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.503 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Started (Lifecycle Event)
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.505 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.511 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.515 185914 INFO nova.virt.libvirt.driver [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance spawned successfully.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.516 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.524 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.528 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.542 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.542 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.543 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.544 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.544 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.545 185914 DEBUG nova.virt.libvirt.driver [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.554 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.555 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249709.5030072, 4433d998-a1da-44d3-ae35-b75895398b1f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.555 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Paused (Lifecycle Event)
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.589 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.591 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249709.5099463, 4433d998-a1da-44d3-ae35-b75895398b1f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.592 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Resumed (Lifecycle Event)
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.617 185914 INFO nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Took 6.68 seconds to spawn the instance on the hypervisor.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.618 185914 DEBUG nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.619 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.625 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.669 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.671 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.671 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.672 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.673 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.699 185914 INFO nova.compute.manager [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Took 7.34 seconds to build instance.
Feb 16 13:48:29 compute-1 nova_compute[185910]: 2026-02-16 13:48:29.717 185914 DEBUG oslo_concurrency.lockutils [None req-c9e392a9-786b-47f7-a516-41f9b1ef7d8f e19cd2d8a8894526ba620ca3249e9a63 76c271745e704d5fa97fe16a7dcd4a81 - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:29 compute-1 podman[215010]: 2026-02-16 13:48:29.771287043 +0000 UTC m=+0.042344237 container create 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 13:48:29 compute-1 systemd[1]: Started libpod-conmon-213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357.scope.
Feb 16 13:48:29 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:48:29 compute-1 podman[215010]: 2026-02-16 13:48:29.747263923 +0000 UTC m=+0.018321137 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:48:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a75bdb78476c49cfeb3ff22a9216fa3b94c402a7539c4326ce57df5b6f4de263/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:48:29 compute-1 podman[215010]: 2026-02-16 13:48:29.858193315 +0000 UTC m=+0.129250509 container init 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:48:29 compute-1 podman[215010]: 2026-02-16 13:48:29.862842561 +0000 UTC m=+0.133899745 container start 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:48:29 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [NOTICE]   (215030) : New worker (215032) forked
Feb 16 13:48:29 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [NOTICE]   (215030) : Loading success.
Feb 16 13:48:30 compute-1 nova_compute[185910]: 2026-02-16 13:48:30.307 185914 DEBUG nova.network.neutron [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updated VIF entry in instance network info cache for port 4005b3ce-3d4d-4741-91d2-940ee880a617. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:48:30 compute-1 nova_compute[185910]: 2026-02-16 13:48:30.308 185914 DEBUG nova.network.neutron [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:48:30 compute-1 nova_compute[185910]: 2026-02-16 13:48:30.324 185914 DEBUG oslo_concurrency.lockutils [req-5a5ffaad-debe-44c7-9c95-f948cacd7dc4 req-f19e647f-3c71-4d41-8ef1-0d3dcced5cc6 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.551 185914 DEBUG nova.compute.manager [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.552 185914 DEBUG oslo_concurrency.lockutils [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.553 185914 DEBUG oslo_concurrency.lockutils [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.553 185914 DEBUG oslo_concurrency.lockutils [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.554 185914 DEBUG nova.compute.manager [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:48:31 compute-1 nova_compute[185910]: 2026-02-16 13:48:31.554 185914 WARNING nova.compute.manager [req-b99fc245-9001-47ce-a839-b8e24bca5eb7 req-71bf6965-f477-4ea3-b9a3-d39eddae71d1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state None.
Feb 16 13:48:33 compute-1 nova_compute[185910]: 2026-02-16 13:48:33.176 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:34 compute-1 nova_compute[185910]: 2026-02-16 13:48:34.645 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:34 compute-1 nova_compute[185910]: 2026-02-16 13:48:34.646 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:48:34 compute-1 nova_compute[185910]: 2026-02-16 13:48:34.668 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:48:34 compute-1 nova_compute[185910]: 2026-02-16 13:48:34.674 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:35 compute-1 podman[195236]: time="2026-02-16T13:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:48:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:48:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 13:48:37 compute-1 podman[215042]: 2026-02-16 13:48:37.940767121 +0000 UTC m=+0.066964804 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 13:48:37 compute-1 podman[215041]: 2026-02-16 13:48:37.944712617 +0000 UTC m=+0.083828369 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 16 13:48:38 compute-1 nova_compute[185910]: 2026-02-16 13:48:38.181 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:38 compute-1 nova_compute[185910]: 2026-02-16 13:48:38.219 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:38.218 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:48:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:38.220 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:48:38 compute-1 nova_compute[185910]: 2026-02-16 13:48:38.655 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:38 compute-1 nova_compute[185910]: 2026-02-16 13:48:38.656 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:48:39 compute-1 nova_compute[185910]: 2026-02-16 13:48:39.676 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:42 compute-1 ovn_controller[96285]: 2026-02-16T13:48:42Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:48:42 compute-1 ovn_controller[96285]: 2026-02-16T13:48:42Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:48:43 compute-1 nova_compute[185910]: 2026-02-16 13:48:43.184 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:43 compute-1 nova_compute[185910]: 2026-02-16 13:48:43.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:48:43 compute-1 nova_compute[185910]: 2026-02-16 13:48:43.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:48:44 compute-1 nova_compute[185910]: 2026-02-16 13:48:44.678 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:45 compute-1 podman[215094]: 2026-02-16 13:48:45.989594943 +0000 UTC m=+0.113876932 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 16 13:48:47 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:48:47.221 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:48:48 compute-1 nova_compute[185910]: 2026-02-16 13:48:48.191 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:49 compute-1 openstack_network_exporter[198096]: ERROR   13:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:48:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:48:49 compute-1 openstack_network_exporter[198096]: ERROR   13:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:48:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:48:49 compute-1 nova_compute[185910]: 2026-02-16 13:48:49.681 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:53 compute-1 nova_compute[185910]: 2026-02-16 13:48:53.194 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:53 compute-1 podman[215122]: 2026-02-16 13:48:53.934128722 +0000 UTC m=+0.064245279 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:48:54 compute-1 nova_compute[185910]: 2026-02-16 13:48:54.683 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:58 compute-1 nova_compute[185910]: 2026-02-16 13:48:58.197 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:48:59 compute-1 nova_compute[185910]: 2026-02-16 13:48:59.686 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:03 compute-1 nova_compute[185910]: 2026-02-16 13:49:03.200 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:03.360 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:03.361 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:03.362 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:04 compute-1 sshd-session[215148]: Invalid user postgres from 188.166.42.159 port 45654
Feb 16 13:49:04 compute-1 nova_compute[185910]: 2026-02-16 13:49:04.688 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:04 compute-1 sshd-session[215148]: Connection closed by invalid user postgres 188.166.42.159 port 45654 [preauth]
Feb 16 13:49:05 compute-1 podman[195236]: time="2026-02-16T13:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:49:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:49:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:49:07 compute-1 ovn_controller[96285]: 2026-02-16T13:49:07Z|00171|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 13:49:08 compute-1 nova_compute[185910]: 2026-02-16 13:49:08.204 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:08 compute-1 podman[215152]: 2026-02-16 13:49:08.903631354 +0000 UTC m=+0.045822086 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:49:08 compute-1 podman[215151]: 2026-02-16 13:49:08.911356072 +0000 UTC m=+0.054193182 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, release=1770267347, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter)
Feb 16 13:49:09 compute-1 nova_compute[185910]: 2026-02-16 13:49:09.689 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:13 compute-1 nova_compute[185910]: 2026-02-16 13:49:13.254 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:14 compute-1 nova_compute[185910]: 2026-02-16 13:49:14.692 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:16 compute-1 podman[215191]: 2026-02-16 13:49:16.96136682 +0000 UTC m=+0.097832008 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 13:49:18 compute-1 nova_compute[185910]: 2026-02-16 13:49:18.257 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:19 compute-1 openstack_network_exporter[198096]: ERROR   13:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:49:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:49:19 compute-1 openstack_network_exporter[198096]: ERROR   13:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:49:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:49:19 compute-1 nova_compute[185910]: 2026-02-16 13:49:19.693 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:23 compute-1 nova_compute[185910]: 2026-02-16 13:49:23.260 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:24 compute-1 nova_compute[185910]: 2026-02-16 13:49:24.655 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:24 compute-1 nova_compute[185910]: 2026-02-16 13:49:24.695 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:24 compute-1 podman[215218]: 2026-02-16 13:49:24.914786874 +0000 UTC m=+0.055290082 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:49:25 compute-1 nova_compute[185910]: 2026-02-16 13:49:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:26 compute-1 nova_compute[185910]: 2026-02-16 13:49:26.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:27 compute-1 nova_compute[185910]: 2026-02-16 13:49:27.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:27 compute-1 nova_compute[185910]: 2026-02-16 13:49:27.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.264 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.666 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.667 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.668 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.668 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.749 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.816 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.817 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:28 compute-1 nova_compute[185910]: 2026-02-16 13:49:28.899 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.109 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.111 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5657MB free_disk=73.1941146850586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.112 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.112 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.549 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 4433d998-a1da-44d3-ae35-b75895398b1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.550 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.551 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.645 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.668 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.698 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.711 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:49:29 compute-1 nova_compute[185910]: 2026-02-16 13:49:29.712 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.707 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.708 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.708 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.708 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.725 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.726 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.726 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.726 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.922 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Check if temp file /var/lib/nova/instances/tmpez75_a0r exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:49:31 compute-1 nova_compute[185910]: 2026-02-16 13:49:31.923 185914 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:49:32 compute-1 sshd-session[215249]: Invalid user test from 146.190.226.24 port 60894
Feb 16 13:49:32 compute-1 nova_compute[185910]: 2026-02-16 13:49:32.543 185914 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:32 compute-1 nova_compute[185910]: 2026-02-16 13:49:32.593 185914 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:32 compute-1 nova_compute[185910]: 2026-02-16 13:49:32.594 185914 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:49:32 compute-1 nova_compute[185910]: 2026-02-16 13:49:32.640 185914 DEBUG oslo_concurrency.processutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:49:32 compute-1 sshd-session[215249]: Connection closed by invalid user test 146.190.226.24 port 60894 [preauth]
Feb 16 13:49:33 compute-1 nova_compute[185910]: 2026-02-16 13:49:33.267 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:34 compute-1 nova_compute[185910]: 2026-02-16 13:49:34.700 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:35 compute-1 nova_compute[185910]: 2026-02-16 13:49:35.092 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:35 compute-1 nova_compute[185910]: 2026-02-16 13:49:35.120 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:49:35 compute-1 nova_compute[185910]: 2026-02-16 13:49:35.121 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:49:35 compute-1 podman[195236]: time="2026-02-16T13:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:49:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:49:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 13:49:38 compute-1 sshd-session[215257]: Accepted publickey for nova from 192.168.122.100 port 52570 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:49:38 compute-1 systemd-logind[821]: New session 46 of user nova.
Feb 16 13:49:38 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:49:38 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:49:38 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:49:38 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:49:38 compute-1 systemd[215261]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:49:38 compute-1 systemd[215261]: Queued start job for default target Main User Target.
Feb 16 13:49:38 compute-1 systemd[215261]: Created slice User Application Slice.
Feb 16 13:49:38 compute-1 systemd[215261]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:49:38 compute-1 systemd[215261]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:49:38 compute-1 systemd[215261]: Reached target Paths.
Feb 16 13:49:38 compute-1 systemd[215261]: Reached target Timers.
Feb 16 13:49:38 compute-1 nova_compute[185910]: 2026-02-16 13:49:38.270 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:38 compute-1 systemd[215261]: Starting D-Bus User Message Bus Socket...
Feb 16 13:49:38 compute-1 systemd[215261]: Starting Create User's Volatile Files and Directories...
Feb 16 13:49:38 compute-1 systemd[215261]: Finished Create User's Volatile Files and Directories.
Feb 16 13:49:38 compute-1 systemd[215261]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:49:38 compute-1 systemd[215261]: Reached target Sockets.
Feb 16 13:49:38 compute-1 systemd[215261]: Reached target Basic System.
Feb 16 13:49:38 compute-1 systemd[215261]: Reached target Main User Target.
Feb 16 13:49:38 compute-1 systemd[215261]: Startup finished in 157ms.
Feb 16 13:49:38 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:49:38 compute-1 systemd[1]: Started Session 46 of User nova.
Feb 16 13:49:38 compute-1 sshd-session[215257]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:49:38 compute-1 sshd-session[215275]: Received disconnect from 192.168.122.100 port 52570:11: disconnected by user
Feb 16 13:49:38 compute-1 sshd-session[215275]: Disconnected from user nova 192.168.122.100 port 52570
Feb 16 13:49:38 compute-1 sshd-session[215257]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:49:38 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Feb 16 13:49:38 compute-1 systemd-logind[821]: Session 46 logged out. Waiting for processes to exit.
Feb 16 13:49:38 compute-1 systemd-logind[821]: Removed session 46.
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.422 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:39.423 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:39.424 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:49:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:39.424 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.462 185914 DEBUG nova.compute.manager [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.463 185914 DEBUG oslo_concurrency.lockutils [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.463 185914 DEBUG oslo_concurrency.lockutils [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.464 185914 DEBUG oslo_concurrency.lockutils [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.464 185914 DEBUG nova.compute.manager [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.464 185914 DEBUG nova.compute.manager [req-d93d8303-990d-457e-8ff3-e79687383ec1 req-1fca14c3-4bfb-4d20-8247-16f8d98e54de faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.660 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.660 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:49:39 compute-1 nova_compute[185910]: 2026-02-16 13:49:39.701 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:39 compute-1 podman[215279]: 2026-02-16 13:49:39.947763605 +0000 UTC m=+0.074889700 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:49:39 compute-1 podman[215278]: 2026-02-16 13:49:39.967000033 +0000 UTC m=+0.094796046 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347)
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.192 185914 INFO nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Took 7.55 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.193 185914 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.213 185914 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpez75_a0r',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4433d998-a1da-44d3-ae35-b75895398b1f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(10e57f75-538f-490b-996e-b4c84b87e270),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.237 185914 DEBUG nova.objects.instance [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 4433d998-a1da-44d3-ae35-b75895398b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.239 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.243 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.244 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.305 185914 DEBUG nova.virt.libvirt.vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:48:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:48:29Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.306 185914 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.308 185914 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.309 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:49:40 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:1c:67:1b"/>
Feb 16 13:49:40 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:49:40 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:49:40 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:49:40 compute-1 nova_compute[185910]:   <target dev="tap4005b3ce-3d"/>
Feb 16 13:49:40 compute-1 nova_compute[185910]: </interface>
Feb 16 13:49:40 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.310 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.747 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.748 185914 INFO nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:49:40 compute-1 nova_compute[185910]: 2026-02-16 13:49:40.846 185914 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.350 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.351 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.562 185914 DEBUG nova.compute.manager [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.563 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.563 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.563 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.564 185914 DEBUG nova.compute.manager [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.564 185914 WARNING nova.compute.manager [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.564 185914 DEBUG nova.compute.manager [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-changed-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.565 185914 DEBUG nova.compute.manager [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Refreshing instance network info cache due to event network-changed-4005b3ce-3d4d-4741-91d2-940ee880a617. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.565 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.565 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.565 185914 DEBUG nova.network.neutron [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Refreshing network info cache for port 4005b3ce-3d4d-4741-91d2-940ee880a617 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.856 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:49:41 compute-1 nova_compute[185910]: 2026-02-16 13:49:41.858 185914 DEBUG nova.virt.libvirt.migration [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.097 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249782.0968304, 4433d998-a1da-44d3-ae35-b75895398b1f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.098 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Paused (Lifecycle Event)
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.127 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.132 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.155 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:49:42 compute-1 kernel: tap4005b3ce-3d (unregistering): left promiscuous mode
Feb 16 13:49:42 compute-1 NetworkManager[56388]: <info>  [1771249782.2229] device (tap4005b3ce-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00172|binding|INFO|Releasing lport 4005b3ce-3d4d-4741-91d2-940ee880a617 from this chassis (sb_readonly=0)
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00173|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 down in Southbound
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00174|binding|INFO|Removing iface tap4005b3ce-3d ovn-installed in OVS
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.233 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.241 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.242 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.244 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.245 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.247 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f71beab5-5f46-4327-a9c4-d1f08a11a45e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.248 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 namespace which is not needed anymore
Feb 16 13:49:42 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Deactivated successfully.
Feb 16 13:49:42 compute-1 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000017.scope: Consumed 16.026s CPU time.
Feb 16 13:49:42 compute-1 systemd-machined[155419]: Machine qemu-15-instance-00000017 terminated.
Feb 16 13:49:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [NOTICE]   (215030) : haproxy version is 2.8.14-c23fe91
Feb 16 13:49:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [NOTICE]   (215030) : path to executable is /usr/sbin/haproxy
Feb 16 13:49:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [WARNING]  (215030) : Exiting Master process...
Feb 16 13:49:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [ALERT]    (215030) : Current worker (215032) exited with code 143 (Terminated)
Feb 16 13:49:42 compute-1 neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8[215026]: [WARNING]  (215030) : All workers exited. Exiting... (0)
Feb 16 13:49:42 compute-1 systemd[1]: libpod-213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357.scope: Deactivated successfully.
Feb 16 13:49:42 compute-1 podman[215364]: 2026-02-16 13:49:42.38759445 +0000 UTC m=+0.053190595 container died 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:49:42 compute-1 kernel: tap4005b3ce-3d: entered promiscuous mode
Feb 16 13:49:42 compute-1 systemd-udevd[215345]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:49:42 compute-1 NetworkManager[56388]: <info>  [1771249782.4243] manager: (tap4005b3ce-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 16 13:49:42 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357-userdata-shm.mount: Deactivated successfully.
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00175|binding|INFO|Claiming lport 4005b3ce-3d4d-4741-91d2-940ee880a617 for this chassis.
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00176|binding|INFO|4005b3ce-3d4d-4741-91d2-940ee880a617: Claiming fa:16:3e:1c:67:1b 10.100.0.11
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.427 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-a75bdb78476c49cfeb3ff22a9216fa3b94c402a7539c4326ce57df5b6f4de263-merged.mount: Deactivated successfully.
Feb 16 13:49:42 compute-1 kernel: tap4005b3ce-3d (unregistering): left promiscuous mode
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00177|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 ovn-installed in OVS
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00178|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 up in Southbound
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.440 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.445 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00179|binding|INFO|Releasing lport 4005b3ce-3d4d-4741-91d2-940ee880a617 from this chassis (sb_readonly=0)
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00180|binding|INFO|Setting lport 4005b3ce-3d4d-4741-91d2-940ee880a617 down in Southbound
Feb 16 13:49:42 compute-1 ovn_controller[96285]: 2026-02-16T13:49:42Z|00181|binding|INFO|Removing iface tap4005b3ce-3d ovn-installed in OVS
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.446 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 podman[215364]: 2026-02-16 13:49:42.450741762 +0000 UTC m=+0.116337887 container cleanup 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.449 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.454 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:67:1b 10.100.0.11'], port_security=['fa:16:3e:1c:67:1b 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4433d998-a1da-44d3-ae35-b75895398b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '76c271745e704d5fa97fe16a7dcd4a81', 'neutron:revision_number': '8', 'neutron:security_group_ids': '832f9d98-96fb-45e2-8c11-0f4534711ee2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb3ae2f5-242d-4288-83f4-c053c76499e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=4005b3ce-3d4d-4741-91d2-940ee880a617) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.455 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 systemd[1]: libpod-conmon-213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357.scope: Deactivated successfully.
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.469 185914 DEBUG nova.virt.libvirt.guest [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.469 185914 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migration operation has completed
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.470 185914 INFO nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] _post_live_migration() is started..
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.476 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.477 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.477 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:49:42 compute-1 podman[215401]: 2026-02-16 13:49:42.522908307 +0000 UTC m=+0.048323763 container remove 213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.529 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[10b0d7c7-d570-43be-9b08-9fa86ae1f707]: (4, ('Mon Feb 16 01:49:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357)\n213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357\nMon Feb 16 01:49:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 (213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357)\n213d0513d643bf7b1f7e133c4e83217cde144a059fa448990937b2fa36e18357\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.532 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a57650-03ae-48a1-ac3b-6508c5e6dfb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.533 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62a1ccdd-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.535 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 kernel: tap62a1ccdd-30: left promiscuous mode
Feb 16 13:49:42 compute-1 nova_compute[185910]: 2026-02-16 13:49:42.545 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.556 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[3b122efd-be27-451c-8968-65ddd8836b89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.583 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[f84c8222-8dd8-4313-bd3e-f60f62652f3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.585 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc83d4a-9ce7-4a1e-bbb1-c07cb4edd3a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.602 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d5ca36-ad86-455d-8efb-148220321a58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 579368, 'reachable_time': 21540, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215423, 'error': None, 'target': 'ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.607 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62a1ccdd-3048-4bbf-acc8-c791bff79ee8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.607 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[0044ec1a-3152-4181-9ecc-2a61260d05e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.608 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:42 compute-1 systemd[1]: run-netns-ovnmeta\x2d62a1ccdd\x2d3048\x2d4bbf\x2dacc8\x2dc791bff79ee8.mount: Deactivated successfully.
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.610 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.611 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb8258f-cb58-4b37-87fd-cd907e0dfb21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.612 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 4005b3ce-3d4d-4741-91d2-940ee880a617 in datapath 62a1ccdd-3048-4bbf-acc8-c791bff79ee8 unbound from our chassis
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.613 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62a1ccdd-3048-4bbf-acc8-c791bff79ee8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:49:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:49:42.614 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[10ea7cb1-2c10-4db2-a262-b3a97bcfc0b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.272 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.651 185914 DEBUG nova.network.neutron [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updated VIF entry in instance network info cache for port 4005b3ce-3d4d-4741-91d2-940ee880a617. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.652 185914 DEBUG nova.network.neutron [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Updating instance_info_cache with network_info: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.683 185914 DEBUG oslo_concurrency.lockutils [req-6a6a4cb9-7d67-42bf-9492-ce6961a15504 req-4af4658c-0dc8-44b9-b514-7251f9a81c82 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-4433d998-a1da-44d3-ae35-b75895398b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.714 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.715 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.715 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.715 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.715 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.715 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.716 185914 WARNING nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.717 185914 WARNING nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG oslo_concurrency.lockutils [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:43 compute-1 nova_compute[185910]: 2026-02-16 13:49:43.718 185914 DEBUG nova.compute.manager [req-b60e176b-fee6-4351-b058-14bfe08be922 req-c731b12f-635a-42b0-b9c4-b9e3ca4ed1ae faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-unplugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.022 185914 DEBUG nova.network.neutron [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 4005b3ce-3d4d-4741-91d2-940ee880a617 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.023 185914 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.024 185914 DEBUG nova.virt.libvirt.vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:48:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-313355006',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-313355006',id=23,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:48:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='76c271745e704d5fa97fe16a7dcd4a81',ramdisk_id='',reservation_id='r-0v5zmrzs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1085993185',owner_user_name='tempest-TestExecuteStrategies-1085993185-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:49:29Z,user_data=None,user_id='e19cd2d8a8894526ba620ca3249e9a63',uuid=4433d998-a1da-44d3-ae35-b75895398b1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.024 185914 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "4005b3ce-3d4d-4741-91d2-940ee880a617", "address": "fa:16:3e:1c:67:1b", "network": {"id": "62a1ccdd-3048-4bbf-acc8-c791bff79ee8", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1774579383-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76c271745e704d5fa97fe16a7dcd4a81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4005b3ce-3d", "ovs_interfaceid": "4005b3ce-3d4d-4741-91d2-940ee880a617", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.026 185914 DEBUG nova.network.os_vif_util [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.026 185914 DEBUG os_vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.028 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.028 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4005b3ce-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.030 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.032 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.036 185914 INFO os_vif [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:67:1b,bridge_name='br-int',has_traffic_filtering=True,id=4005b3ce-3d4d-4741-91d2-940ee880a617,network=Network(62a1ccdd-3048-4bbf-acc8-c791bff79ee8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4005b3ce-3d')
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.037 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.038 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.038 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.039 185914 DEBUG nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.039 185914 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Deleting instance files /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f_del
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.040 185914 INFO nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Deletion of /var/lib/nova/instances/4433d998-a1da-44d3-ae35-b75895398b1f_del complete
Feb 16 13:49:44 compute-1 nova_compute[185910]: 2026-02-16 13:49:44.704 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.855 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.856 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.856 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.856 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.857 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.857 185914 WARNING nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.857 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.857 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.857 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.858 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.858 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.858 185914 WARNING nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.858 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.858 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 WARNING nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.859 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.860 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.860 185914 DEBUG oslo_concurrency.lockutils [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.860 185914 DEBUG nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] No waiting events found dispatching network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:49:45 compute-1 nova_compute[185910]: 2026-02-16 13:49:45.860 185914 WARNING nova.compute.manager [req-b09bc647-a27b-483a-8ae9-f6c5f4605836 req-2963cfcb-5128-4ae6-a672-0978cbca27d5 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Received unexpected event network-vif-plugged-4005b3ce-3d4d-4741-91d2-940ee880a617 for instance with vm_state active and task_state migrating.
Feb 16 13:49:47 compute-1 podman[215424]: 2026-02-16 13:49:47.940829876 +0000 UTC m=+0.085373903 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:49:48 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:49:48 compute-1 systemd[215261]: Activating special unit Exit the Session...
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped target Main User Target.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped target Basic System.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped target Paths.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped target Sockets.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped target Timers.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:49:48 compute-1 systemd[215261]: Closed D-Bus User Message Bus Socket.
Feb 16 13:49:48 compute-1 systemd[215261]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:49:48 compute-1 systemd[215261]: Removed slice User Application Slice.
Feb 16 13:49:48 compute-1 systemd[215261]: Reached target Shutdown.
Feb 16 13:49:48 compute-1 systemd[215261]: Finished Exit the Session.
Feb 16 13:49:48 compute-1 systemd[215261]: Reached target Exit the Session.
Feb 16 13:49:48 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:49:48 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:49:48 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:49:48 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:49:48 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:49:48 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:49:48 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.032 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.277 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.278 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.278 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "4433d998-a1da-44d3-ae35-b75895398b1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.314 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.315 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.315 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.315 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:49:49 compute-1 openstack_network_exporter[198096]: ERROR   13:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:49:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:49:49 compute-1 openstack_network_exporter[198096]: ERROR   13:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:49:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.468 185914 WARNING nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.469 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5777MB free_disk=73.22310638427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.469 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.469 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.538 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 4433d998-a1da-44d3-ae35-b75895398b1f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.564 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.602 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration 10e57f75-538f-490b-996e-b4c84b87e270 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.603 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.603 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.655 185914 DEBUG nova.compute.provider_tree [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.677 185914 DEBUG nova.scheduler.client.report [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.706 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.711 185914 DEBUG nova.compute.resource_tracker [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.712 185914 DEBUG oslo_concurrency.lockutils [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.717 185914 INFO nova.compute.manager [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.821 185914 INFO nova.scheduler.client.report [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration 10e57f75-538f-490b-996e-b4c84b87e270
Feb 16 13:49:49 compute-1 nova_compute[185910]: 2026-02-16 13:49:49.822 185914 DEBUG nova.virt.libvirt.driver [None req-11fb7a2f-7445-4f74-80a4-7fcd93c341ba bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:49:54 compute-1 nova_compute[185910]: 2026-02-16 13:49:54.037 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:54 compute-1 nova_compute[185910]: 2026-02-16 13:49:54.708 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:55 compute-1 sshd-session[215455]: Invalid user postgres from 188.166.42.159 port 37890
Feb 16 13:49:55 compute-1 podman[215457]: 2026-02-16 13:49:55.347136161 +0000 UTC m=+0.068691142 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:49:55 compute-1 sshd-session[215455]: Connection closed by invalid user postgres 188.166.42.159 port 37890 [preauth]
Feb 16 13:49:57 compute-1 nova_compute[185910]: 2026-02-16 13:49:57.469 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249782.4685924, 4433d998-a1da-44d3-ae35-b75895398b1f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:49:57 compute-1 nova_compute[185910]: 2026-02-16 13:49:57.469 185914 INFO nova.compute.manager [-] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] VM Stopped (Lifecycle Event)
Feb 16 13:49:57 compute-1 nova_compute[185910]: 2026-02-16 13:49:57.502 185914 DEBUG nova.compute.manager [None req-83a0f5c5-8616-4e8c-b724-fad5bd6d0191 - - - - - -] [instance: 4433d998-a1da-44d3-ae35-b75895398b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:49:59 compute-1 nova_compute[185910]: 2026-02-16 13:49:59.081 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:49:59 compute-1 nova_compute[185910]: 2026-02-16 13:49:59.710 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:03.362 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:03.362 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:03.362 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:04 compute-1 nova_compute[185910]: 2026-02-16 13:50:04.082 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:04 compute-1 nova_compute[185910]: 2026-02-16 13:50:04.712 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:05 compute-1 podman[195236]: time="2026-02-16T13:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:50:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:50:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:50:09 compute-1 nova_compute[185910]: 2026-02-16 13:50:09.084 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:09 compute-1 nova_compute[185910]: 2026-02-16 13:50:09.714 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:10 compute-1 podman[215483]: 2026-02-16 13:50:10.919363638 +0000 UTC m=+0.051110348 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:50:10 compute-1 podman[215482]: 2026-02-16 13:50:10.93241412 +0000 UTC m=+0.066952065 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 13:50:14 compute-1 nova_compute[185910]: 2026-02-16 13:50:14.130 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:14 compute-1 nova_compute[185910]: 2026-02-16 13:50:14.716 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:17 compute-1 nova_compute[185910]: 2026-02-16 13:50:17.243 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:18 compute-1 podman[215522]: 2026-02-16 13:50:18.944578466 +0000 UTC m=+0.083655496 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:50:19 compute-1 nova_compute[185910]: 2026-02-16 13:50:19.133 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:19 compute-1 openstack_network_exporter[198096]: ERROR   13:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:50:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:50:19 compute-1 openstack_network_exporter[198096]: ERROR   13:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:50:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:50:19 compute-1 nova_compute[185910]: 2026-02-16 13:50:19.718 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:24 compute-1 nova_compute[185910]: 2026-02-16 13:50:24.135 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:24 compute-1 nova_compute[185910]: 2026-02-16 13:50:24.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:24 compute-1 nova_compute[185910]: 2026-02-16 13:50:24.721 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:25 compute-1 podman[215548]: 2026-02-16 13:50:25.909248777 +0000 UTC m=+0.047714936 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:50:26 compute-1 nova_compute[185910]: 2026-02-16 13:50:26.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:27 compute-1 nova_compute[185910]: 2026-02-16 13:50:27.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:28 compute-1 nova_compute[185910]: 2026-02-16 13:50:28.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:29 compute-1 nova_compute[185910]: 2026-02-16 13:50:29.192 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:29 compute-1 nova_compute[185910]: 2026-02-16 13:50:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:29 compute-1 nova_compute[185910]: 2026-02-16 13:50:29.723 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.664 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.847 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.848 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5802MB free_disk=73.22310638427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.848 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:50:30 compute-1 nova_compute[185910]: 2026-02-16 13:50:30.849 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.068 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.068 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.091 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.111 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.112 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:50:31 compute-1 nova_compute[185910]: 2026-02-16 13:50:31.113 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:50:33 compute-1 nova_compute[185910]: 2026-02-16 13:50:33.107 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:33 compute-1 nova_compute[185910]: 2026-02-16 13:50:33.109 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:33 compute-1 nova_compute[185910]: 2026-02-16 13:50:33.109 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:50:33 compute-1 nova_compute[185910]: 2026-02-16 13:50:33.110 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:50:33 compute-1 nova_compute[185910]: 2026-02-16 13:50:33.136 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:50:34 compute-1 nova_compute[185910]: 2026-02-16 13:50:34.195 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:34 compute-1 nova_compute[185910]: 2026-02-16 13:50:34.725 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:35 compute-1 podman[195236]: time="2026-02-16T13:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:50:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:50:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:50:39 compute-1 nova_compute[185910]: 2026-02-16 13:50:39.246 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:39 compute-1 nova_compute[185910]: 2026-02-16 13:50:39.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:50:39 compute-1 nova_compute[185910]: 2026-02-16 13:50:39.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:50:39 compute-1 nova_compute[185910]: 2026-02-16 13:50:39.727 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:40 compute-1 sshd-session[215572]: Invalid user ubuntu from 2.57.122.210 port 46052
Feb 16 13:50:40 compute-1 sshd-session[215572]: Connection closed by invalid user ubuntu 2.57.122.210 port 46052 [preauth]
Feb 16 13:50:41 compute-1 sshd-session[215574]: Invalid user test from 146.190.226.24 port 41694
Feb 16 13:50:41 compute-1 podman[215577]: 2026-02-16 13:50:41.355510076 +0000 UTC m=+0.058633711 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:50:41 compute-1 podman[215576]: 2026-02-16 13:50:41.378830724 +0000 UTC m=+0.086696938 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Feb 16 13:50:41 compute-1 sshd-session[215574]: Connection closed by invalid user test 146.190.226.24 port 41694 [preauth]
Feb 16 13:50:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:43.582 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:50:43 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:43.583 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:50:43 compute-1 nova_compute[185910]: 2026-02-16 13:50:43.583 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:44 compute-1 nova_compute[185910]: 2026-02-16 13:50:44.295 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:44 compute-1 nova_compute[185910]: 2026-02-16 13:50:44.730 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:49 compute-1 nova_compute[185910]: 2026-02-16 13:50:49.342 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:49 compute-1 openstack_network_exporter[198096]: ERROR   13:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:50:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:50:49 compute-1 openstack_network_exporter[198096]: ERROR   13:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:50:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:50:49 compute-1 nova_compute[185910]: 2026-02-16 13:50:49.733 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:49 compute-1 podman[215615]: 2026-02-16 13:50:49.989426721 +0000 UTC m=+0.129859801 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 13:50:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:50:50.586 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:50:51 compute-1 ovn_controller[96285]: 2026-02-16T13:50:51Z|00182|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 13:50:53 compute-1 sshd-session[215642]: Invalid user postgres from 188.166.42.159 port 44606
Feb 16 13:50:54 compute-1 nova_compute[185910]: 2026-02-16 13:50:54.345 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:54 compute-1 sshd-session[215642]: Connection closed by invalid user postgres 188.166.42.159 port 44606 [preauth]
Feb 16 13:50:54 compute-1 nova_compute[185910]: 2026-02-16 13:50:54.735 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:56 compute-1 podman[215644]: 2026-02-16 13:50:56.904758432 +0000 UTC m=+0.050027650 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:50:59 compute-1 nova_compute[185910]: 2026-02-16 13:50:59.348 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:50:59 compute-1 nova_compute[185910]: 2026-02-16 13:50:59.736 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:01 compute-1 sshd-session[215668]: Connection closed by 45.148.10.121 port 48042 [preauth]
Feb 16 13:51:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:03.362 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:03.363 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:03.363 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:04 compute-1 nova_compute[185910]: 2026-02-16 13:51:04.365 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:04 compute-1 nova_compute[185910]: 2026-02-16 13:51:04.739 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:05 compute-1 podman[195236]: time="2026-02-16T13:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:51:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:51:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.420 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.421 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.437 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.518 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.519 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.526 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.526 185914 INFO nova.compute.claims [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.642 185914 DEBUG nova.compute.provider_tree [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.661 185914 DEBUG nova.scheduler.client.report [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.699 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.700 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.739 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.740 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.761 185914 INFO nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.786 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.876 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.877 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.878 185914 INFO nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Creating image(s)
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.879 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.879 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.880 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.894 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.961 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.963 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.963 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:07 compute-1 nova_compute[185910]: 2026-02-16 13:51:07.980 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.042 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.044 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.073 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.075 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.076 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.101 185914 DEBUG nova.policy [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7c8dce27a2f4917a7dac485b1d8754a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.125 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.126 185914 DEBUG nova.virt.disk.api [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Checking if we can resize image /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.126 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.181 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.182 185914 DEBUG nova.virt.disk.api [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Cannot resize image /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.183 185914 DEBUG nova.objects.instance [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'migration_context' on Instance uuid e72c2fc7-4686-493c-ac27-4a865859dd3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.205 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.206 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Ensure instance console log exists: /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.207 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.208 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.208 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:08 compute-1 nova_compute[185910]: 2026-02-16 13:51:08.737 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Successfully created port: 6fae2663-44ca-4efe-b29c-5df60eb75e74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.369 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.741 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.768 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Successfully updated port: 6fae2663-44ca-4efe-b29c-5df60eb75e74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.785 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.786 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquired lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.786 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.868 185914 DEBUG nova.compute.manager [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-changed-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.869 185914 DEBUG nova.compute.manager [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Refreshing instance network info cache due to event network-changed-6fae2663-44ca-4efe-b29c-5df60eb75e74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:51:09 compute-1 nova_compute[185910]: 2026-02-16 13:51:09.869 185914 DEBUG oslo_concurrency.lockutils [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.072 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.847 185914 DEBUG nova.network.neutron [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updating instance_info_cache with network_info: [{"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.868 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Releasing lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.869 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Instance network_info: |[{"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.870 185914 DEBUG oslo_concurrency.lockutils [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.870 185914 DEBUG nova.network.neutron [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Refreshing network info cache for port 6fae2663-44ca-4efe-b29c-5df60eb75e74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.872 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Start _get_guest_xml network_info=[{"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.877 185914 WARNING nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.882 185914 DEBUG nova.virt.libvirt.host [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.882 185914 DEBUG nova.virt.libvirt.host [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.886 185914 DEBUG nova.virt.libvirt.host [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.886 185914 DEBUG nova.virt.libvirt.host [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.887 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.887 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.888 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.888 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.888 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.888 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.888 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.889 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.889 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.889 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.889 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.890 185914 DEBUG nova.virt.hardware [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.893 185914 DEBUG nova.virt.libvirt.vif [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-965605715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-965605715',id=26,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-xljyumbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBal
anceStrategy-1500862259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:51:07Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=e72c2fc7-4686-493c-ac27-4a865859dd3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.893 185914 DEBUG nova.network.os_vif_util [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.894 185914 DEBUG nova.network.os_vif_util [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.894 185914 DEBUG nova.objects.instance [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'pci_devices' on Instance uuid e72c2fc7-4686-493c-ac27-4a865859dd3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.925 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <uuid>e72c2fc7-4686-493c-ac27-4a865859dd3d</uuid>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <name>instance-0000001a</name>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-965605715</nova:name>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:51:10</nova:creationTime>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:user uuid="c7c8dce27a2f4917a7dac485b1d8754a">tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member</nova:user>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:project uuid="5c4a5b3f08ab466eaac86305d91fd9a8">tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259</nova:project>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         <nova:port uuid="6fae2663-44ca-4efe-b29c-5df60eb75e74">
Feb 16 13:51:10 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <system>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="serial">e72c2fc7-4686-493c-ac27-4a865859dd3d</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="uuid">e72c2fc7-4686-493c-ac27-4a865859dd3d</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </system>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <os>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </os>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <features>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </features>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.config"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:cc:be:11"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <target dev="tap6fae2663-44"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/console.log" append="off"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <video>
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </video>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:51:10 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:51:10 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:51:10 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:51:10 compute-1 nova_compute[185910]: </domain>
Feb 16 13:51:10 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.926 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Preparing to wait for external event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.926 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.926 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.926 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.927 185914 DEBUG nova.virt.libvirt.vif [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-965605715',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-965605715',id=26,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-xljyumbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmW
orkloadBalanceStrategy-1500862259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:51:07Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=e72c2fc7-4686-493c-ac27-4a865859dd3d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.927 185914 DEBUG nova.network.os_vif_util [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.928 185914 DEBUG nova.network.os_vif_util [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.928 185914 DEBUG os_vif [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.929 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.929 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.929 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.936 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.936 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6fae2663-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.936 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6fae2663-44, col_values=(('external_ids', {'iface-id': '6fae2663-44ca-4efe-b29c-5df60eb75e74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:be:11', 'vm-uuid': 'e72c2fc7-4686-493c-ac27-4a865859dd3d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.938 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:10 compute-1 NetworkManager[56388]: <info>  [1771249870.9399] manager: (tap6fae2663-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.941 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.945 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:10 compute-1 nova_compute[185910]: 2026-02-16 13:51:10.946 185914 INFO os_vif [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44')
Feb 16 13:51:11 compute-1 nova_compute[185910]: 2026-02-16 13:51:11.009 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:51:11 compute-1 nova_compute[185910]: 2026-02-16 13:51:11.010 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:51:11 compute-1 nova_compute[185910]: 2026-02-16 13:51:11.010 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] No VIF found with MAC fa:16:3e:cc:be:11, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:51:11 compute-1 nova_compute[185910]: 2026-02-16 13:51:11.010 185914 INFO nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Using config drive
Feb 16 13:51:11 compute-1 podman[215688]: 2026-02-16 13:51:11.907948521 +0000 UTC m=+0.052985299 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 13:51:11 compute-1 podman[215689]: 2026-02-16 13:51:11.938966747 +0000 UTC m=+0.078521837 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.068 185914 INFO nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Creating config drive at /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.config
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.074 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9f53adfs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.204 185914 DEBUG oslo_concurrency.processutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9f53adfs" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:13 compute-1 kernel: tap6fae2663-44: entered promiscuous mode
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.2736] manager: (tap6fae2663-44): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Feb 16 13:51:13 compute-1 ovn_controller[96285]: 2026-02-16T13:51:13Z|00183|binding|INFO|Claiming lport 6fae2663-44ca-4efe-b29c-5df60eb75e74 for this chassis.
Feb 16 13:51:13 compute-1 ovn_controller[96285]: 2026-02-16T13:51:13Z|00184|binding|INFO|6fae2663-44ca-4efe-b29c-5df60eb75e74: Claiming fa:16:3e:cc:be:11 10.100.0.14
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.276 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.290 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:be:11 10.100.0.14'], port_security=['fa:16:3e:cc:be:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e72c2fc7-4686-493c-ac27-4a865859dd3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6fae2663-44ca-4efe-b29c-5df60eb75e74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.292 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6fae2663-44ca-4efe-b29c-5df60eb75e74 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f bound to our chassis
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.295 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.303 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:13 compute-1 ovn_controller[96285]: 2026-02-16T13:51:13Z|00185|binding|INFO|Setting lport 6fae2663-44ca-4efe-b29c-5df60eb75e74 ovn-installed in OVS
Feb 16 13:51:13 compute-1 ovn_controller[96285]: 2026-02-16T13:51:13Z|00186|binding|INFO|Setting lport 6fae2663-44ca-4efe-b29c-5df60eb75e74 up in Southbound
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.307 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc5e451-2a72-4a30-822d-acf177c4c8f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.309 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap25f604b5-71 in ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.310 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.312 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap25f604b5-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.312 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[93917f90-5693-48c6-8ee7-3511dbf7d84b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.313 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[66ec4d87-91ed-4391-9d56-7b908e4e3eba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 systemd-machined[155419]: New machine qemu-16-instance-0000001a.
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.327 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[21b14600-bd28-475f-8b0d-c59c3ad089c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 systemd[1]: Started Virtual Machine qemu-16-instance-0000001a.
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.340 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c2eae8-f937-43fa-b391-a7199a11ea1c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 systemd-udevd[215749]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.3577] device (tap6fae2663-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.3586] device (tap6fae2663-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.373 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[ce31f81d-9356-4597-ba31-214b436a6894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 systemd-udevd[215753]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.380 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[700c9c9f-674a-49a0-9761-4d52ec836764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.3829] manager: (tap25f604b5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.416 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[28346d86-e3b5-4952-9523-05f3af0ceec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.420 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecc1afd-7c3b-4245-9127-a12b82af6324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.4426] device (tap25f604b5-70): carrier: link connected
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.448 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[80b6fd46-443b-4e29-93fa-86ee1c814cc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.466 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[959a3a70-bf49-4ced-a05e-523578e8fed2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595786, 'reachable_time': 38473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215779, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.478 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a03364b1-e3ab-45bd-92dd-4ffdfab5de63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:17ad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595786, 'tstamp': 595786}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215780, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.491 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c87f4d-0e7f-49fe-834d-e2a6d17983b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595786, 'reachable_time': 38473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215781, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.514 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[af6e8b37-002b-4b2b-989d-3e8ae0683d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.554 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[005b480b-47f3-45e5-a224-d4fdfbe7b64b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.556 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.557 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.557 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25f604b5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:13 compute-1 kernel: tap25f604b5-70: entered promiscuous mode
Feb 16 13:51:13 compute-1 NetworkManager[56388]: <info>  [1771249873.5617] manager: (tap25f604b5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.561 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25f604b5-70, col_values=(('external_ids', {'iface-id': 'a43b300c-9dd2-4ad8-8dd7-aaeb277a3352'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:13 compute-1 ovn_controller[96285]: 2026-02-16T13:51:13Z|00187|binding|INFO|Releasing lport a43b300c-9dd2-4ad8-8dd7-aaeb277a3352 from this chassis (sb_readonly=0)
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.564 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.565 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[82cf3bc2-898f-4cc8-bded-d8a0fa78feab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.566 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/25f604b5-711f-4df5-a65b-4ca0c988350f.pid.haproxy
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:51:13 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:13.567 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'env', 'PROCESS_TAG=haproxy-25f604b5-711f-4df5-a65b-4ca0c988350f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/25f604b5-711f-4df5-a65b-4ca0c988350f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:51:13 compute-1 nova_compute[185910]: 2026-02-16 13:51:13.567 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:13 compute-1 podman[215814]: 2026-02-16 13:51:13.900556861 +0000 UTC m=+0.045394264 container create 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:51:13 compute-1 systemd[1]: Started libpod-conmon-020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe.scope.
Feb 16 13:51:13 compute-1 podman[215814]: 2026-02-16 13:51:13.876750699 +0000 UTC m=+0.021588122 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:51:13 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:51:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61827dd2e2e8cb061c6e34812348b62529dd5e3984f1c674ba5da276507abf7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:51:13 compute-1 podman[215814]: 2026-02-16 13:51:13.993424874 +0000 UTC m=+0.138262297 container init 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 16 13:51:14 compute-1 podman[215814]: 2026-02-16 13:51:14.001578724 +0000 UTC m=+0.146416127 container start 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 13:51:14 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [NOTICE]   (215834) : New worker (215836) forked
Feb 16 13:51:14 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [NOTICE]   (215834) : Loading success.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.232 185914 DEBUG nova.compute.manager [req-c5a616b2-3045-4281-b8a0-b04bba1d42f8 req-02a86d2a-3295-40d0-a3d5-9158e1734cb1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.233 185914 DEBUG oslo_concurrency.lockutils [req-c5a616b2-3045-4281-b8a0-b04bba1d42f8 req-02a86d2a-3295-40d0-a3d5-9158e1734cb1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.235 185914 DEBUG oslo_concurrency.lockutils [req-c5a616b2-3045-4281-b8a0-b04bba1d42f8 req-02a86d2a-3295-40d0-a3d5-9158e1734cb1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.235 185914 DEBUG oslo_concurrency.lockutils [req-c5a616b2-3045-4281-b8a0-b04bba1d42f8 req-02a86d2a-3295-40d0-a3d5-9158e1734cb1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.235 185914 DEBUG nova.compute.manager [req-c5a616b2-3045-4281-b8a0-b04bba1d42f8 req-02a86d2a-3295-40d0-a3d5-9158e1734cb1 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Processing event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.303 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.304 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249874.3026679, e72c2fc7-4686-493c-ac27-4a865859dd3d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.304 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] VM Started (Lifecycle Event)
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.307 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.312 185914 INFO nova.virt.libvirt.driver [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Instance spawned successfully.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.313 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.368 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.374 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.379 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.379 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.380 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.381 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.381 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.383 185914 DEBUG nova.virt.libvirt.driver [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.409 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.410 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249874.3036726, e72c2fc7-4686-493c-ac27-4a865859dd3d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.410 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] VM Paused (Lifecycle Event)
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.434 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.439 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249874.3069582, e72c2fc7-4686-493c-ac27-4a865859dd3d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.439 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] VM Resumed (Lifecycle Event)
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.455 185914 INFO nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Took 6.58 seconds to spawn the instance on the hypervisor.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.456 185914 DEBUG nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.488 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.494 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.532 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.544 185914 INFO nova.compute.manager [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Took 7.06 seconds to build instance.
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.560 185914 DEBUG oslo_concurrency.lockutils [None req-3b12a8dd-ff76-4baf-8bfa-4956fb5fd114 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:14 compute-1 nova_compute[185910]: 2026-02-16 13:51:14.744 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:15 compute-1 nova_compute[185910]: 2026-02-16 13:51:15.333 185914 DEBUG nova.network.neutron [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updated VIF entry in instance network info cache for port 6fae2663-44ca-4efe-b29c-5df60eb75e74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:51:15 compute-1 nova_compute[185910]: 2026-02-16 13:51:15.334 185914 DEBUG nova.network.neutron [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updating instance_info_cache with network_info: [{"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:15 compute-1 nova_compute[185910]: 2026-02-16 13:51:15.364 185914 DEBUG oslo_concurrency.lockutils [req-d277f3ce-aa93-42a8-acde-140aa37e6429 req-1cf95b2b-d06c-4740-bf2f-4fc3e74a73d2 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:15 compute-1 nova_compute[185910]: 2026-02-16 13:51:15.939 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.336 185914 DEBUG nova.compute.manager [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.336 185914 DEBUG oslo_concurrency.lockutils [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.337 185914 DEBUG oslo_concurrency.lockutils [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.337 185914 DEBUG oslo_concurrency.lockutils [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.337 185914 DEBUG nova.compute.manager [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] No waiting events found dispatching network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:16 compute-1 nova_compute[185910]: 2026-02-16 13:51:16.337 185914 WARNING nova.compute.manager [req-7f4fc673-19e2-476d-a2b9-190eadeb133d req-2334f546-2417-4f8d-99b6-3efdd0907a9b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received unexpected event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 for instance with vm_state active and task_state None.
Feb 16 13:51:19 compute-1 openstack_network_exporter[198096]: ERROR   13:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:51:19 compute-1 openstack_network_exporter[198096]: ERROR   13:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:51:19 compute-1 nova_compute[185910]: 2026-02-16 13:51:19.746 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:20 compute-1 podman[215852]: 2026-02-16 13:51:20.940005958 +0000 UTC m=+0.083190643 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 13:51:20 compute-1 nova_compute[185910]: 2026-02-16 13:51:20.941 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:24 compute-1 nova_compute[185910]: 2026-02-16 13:51:24.748 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:25 compute-1 nova_compute[185910]: 2026-02-16 13:51:25.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:25 compute-1 nova_compute[185910]: 2026-02-16 13:51:25.943 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:26 compute-1 nova_compute[185910]: 2026-02-16 13:51:26.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:27 compute-1 ovn_controller[96285]: 2026-02-16T13:51:27Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:be:11 10.100.0.14
Feb 16 13:51:27 compute-1 ovn_controller[96285]: 2026-02-16T13:51:27Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:be:11 10.100.0.14
Feb 16 13:51:27 compute-1 podman[215900]: 2026-02-16 13:51:27.923943019 +0000 UTC m=+0.057482131 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:51:28 compute-1 nova_compute[185910]: 2026-02-16 13:51:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:28 compute-1 nova_compute[185910]: 2026-02-16 13:51:28.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:29 compute-1 nova_compute[185910]: 2026-02-16 13:51:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:29 compute-1 nova_compute[185910]: 2026-02-16 13:51:29.750 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.125 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating tmpfile /var/lib/nova/instances/tmpldl3rikf to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.283 185914 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.654 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.655 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.656 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.656 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.726 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.780 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.782 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.833 185914 DEBUG oslo_concurrency.processutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:30 compute-1 nova_compute[185910]: 2026-02-16 13:51:30.945 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.001 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.002 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5645MB free_disk=73.19440460205078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.002 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.003 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.064 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance e72c2fc7-4686-493c-ac27-4a865859dd3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.090 185914 WARNING nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Instance 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.091 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.091 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.112 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.141 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.142 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.165 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.189 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.256 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.281 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.315 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.315 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.680 185914 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.711 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.712 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:31 compute-1 nova_compute[185910]: 2026-02-16 13:51:31.712 185914 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.310 185914 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.354 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.357 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.358 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating instance directory: /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.358 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Creating disk.info with the contents: {'/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk': 'qcow2', '/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.359 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.361 185914 DEBUG nova.objects.instance [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.398 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.451 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.453 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.453 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.468 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.521 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.523 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.555 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.557 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.557 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.608 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.609 185914 DEBUG nova.virt.disk.api [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Checking if we can resize image /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.610 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.657 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.658 185914 DEBUG nova.virt.disk.api [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Cannot resize image /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.658 185914 DEBUG nova.objects.instance [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.674 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.696 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.698 185914 DEBUG nova.virt.libvirt.volume.remotefs [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config to /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 13:51:33 compute-1 nova_compute[185910]: 2026-02-16 13:51:33.698 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.100 185914 DEBUG oslo_concurrency.processutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32/disk.config /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.100 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.102 185914 DEBUG nova.virt.libvirt.vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:50:59Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.102 185914 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.103 185914 DEBUG nova.network.os_vif_util [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.103 185914 DEBUG os_vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.104 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.104 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.105 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.107 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.107 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7a22eb4-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.108 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7a22eb4-a3, col_values=(('external_ids', {'iface-id': 'b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:81:d6', 'vm-uuid': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.109 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:34 compute-1 NetworkManager[56388]: <info>  [1771249894.1116] manager: (tapb7a22eb4-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.112 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.117 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.118 185914 INFO os_vif [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3')
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.118 185914 DEBUG nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.118 185914 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 13:51:34 compute-1 nova_compute[185910]: 2026-02-16 13:51:34.753 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.256 185914 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.259 185914 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpldl3rikf',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b24deb5-a1f1-4154-a8a4-c31c69dc5d32',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.310 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.310 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.311 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.311 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:51:35 compute-1 systemd[1]: Starting libvirt proxy daemon...
Feb 16 13:51:35 compute-1 systemd[1]: Started libvirt proxy daemon.
Feb 16 13:51:35 compute-1 kernel: tapb7a22eb4-a3: entered promiscuous mode
Feb 16 13:51:35 compute-1 NetworkManager[56388]: <info>  [1771249895.5455] manager: (tapb7a22eb4-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Feb 16 13:51:35 compute-1 ovn_controller[96285]: 2026-02-16T13:51:35Z|00188|binding|INFO|Claiming lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for this additional chassis.
Feb 16 13:51:35 compute-1 ovn_controller[96285]: 2026-02-16T13:51:35Z|00189|binding|INFO|b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1: Claiming fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.546 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.554 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-1 ovn_controller[96285]: 2026-02-16T13:51:35Z|00190|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 ovn-installed in OVS
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.555 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.558 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:35 compute-1 systemd-machined[155419]: New machine qemu-17-instance-00000019.
Feb 16 13:51:35 compute-1 systemd[1]: Started Virtual Machine qemu-17-instance-00000019.
Feb 16 13:51:35 compute-1 systemd-udevd[215988]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.612 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.612 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.612 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:51:35 compute-1 nova_compute[185910]: 2026-02-16 13:51:35.613 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid e72c2fc7-4686-493c-ac27-4a865859dd3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:35 compute-1 NetworkManager[56388]: <info>  [1771249895.6199] device (tapb7a22eb4-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:51:35 compute-1 NetworkManager[56388]: <info>  [1771249895.6207] device (tapb7a22eb4-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:51:35 compute-1 podman[195236]: time="2026-02-16T13:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:51:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:51:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.245 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249896.2449799, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.247 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Started (Lifecycle Event)
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.274 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.837 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updating instance_info_cache with network_info: [{"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.890 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-e72c2fc7-4686-493c-ac27-4a865859dd3d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.890 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.960 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249896.9596593, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.961 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Resumed (Lifecycle Event)
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.984 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:36 compute-1 nova_compute[185910]: 2026-02-16 13:51:36.988 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:51:37 compute-1 nova_compute[185910]: 2026-02-16 13:51:37.008 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Feb 16 13:51:38 compute-1 ovn_controller[96285]: 2026-02-16T13:51:38Z|00191|binding|INFO|Claiming lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for this chassis.
Feb 16 13:51:38 compute-1 ovn_controller[96285]: 2026-02-16T13:51:38Z|00192|binding|INFO|b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1: Claiming fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:51:38 compute-1 ovn_controller[96285]: 2026-02-16T13:51:38Z|00193|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 up in Southbound
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.114 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.116 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f bound to our chassis
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.117 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.133 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[81720667-d476-464d-8e96-1a358fafd523]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.159 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[fa383df2-3e9d-405c-99de-19ad70b704d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.164 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[25599a89-a84a-4e0b-8e38-737de689a58a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.200 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[e759d65a-fc39-4dcb-bfdc-97ec073fc039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.224 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[884da6a2-37f4-4c52-94d6-dbc7ceb8e3ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595786, 'reachable_time': 38473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216022, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.245 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e838ae-b9e9-48bf-b16c-c6426359d71f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap25f604b5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595795, 'tstamp': 595795}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216023, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap25f604b5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595797, 'tstamp': 595797}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216023, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.247 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:38 compute-1 nova_compute[185910]: 2026-02-16 13:51:38.250 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.251 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25f604b5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.252 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.252 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25f604b5-70, col_values=(('external_ids', {'iface-id': 'a43b300c-9dd2-4ad8-8dd7-aaeb277a3352'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:38.253 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:38 compute-1 nova_compute[185910]: 2026-02-16 13:51:38.300 185914 INFO nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Post operation of migration started
Feb 16 13:51:38 compute-1 nova_compute[185910]: 2026-02-16 13:51:38.663 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:51:38 compute-1 nova_compute[185910]: 2026-02-16 13:51:38.663 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:51:38 compute-1 nova_compute[185910]: 2026-02-16 13:51:38.664 185914 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.111 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.756 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.892 185914 DEBUG nova.network.neutron [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [{"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.911 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.930 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.930 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.931 185914 DEBUG oslo_concurrency.lockutils [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:39 compute-1 nova_compute[185910]: 2026-02-16 13:51:39.936 185914 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 13:51:39 compute-1 virtqemud[185025]: Domain id=17 name='instance-00000019' uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 is tainted: custom-monitor
Feb 16 13:51:40 compute-1 nova_compute[185910]: 2026-02-16 13:51:40.945 185914 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.657 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.658 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.953 185914 INFO nova.virt.libvirt.driver [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.959 185914 DEBUG nova.compute.manager [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:51:41 compute-1 nova_compute[185910]: 2026-02-16 13:51:41.991 185914 DEBUG nova.objects.instance [None req-f0ec306f-424e-4dbf-8261-4a3536a2121f bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 13:51:42 compute-1 podman[216024]: 2026-02-16 13:51:42.933989722 +0000 UTC m=+0.066636627 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1770267347, version=9.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:51:42 compute-1 podman[216025]: 2026-02-16 13:51:42.959848629 +0000 UTC m=+0.089837463 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 16 13:51:44 compute-1 nova_compute[185910]: 2026-02-16 13:51:44.114 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:44 compute-1 nova_compute[185910]: 2026-02-16 13:51:44.759 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.622 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.624 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.625 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.625 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.625 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.627 185914 INFO nova.compute.manager [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Terminating instance
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.628 185914 DEBUG nova.compute.manager [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:51:46 compute-1 kernel: tap6fae2663-44 (unregistering): left promiscuous mode
Feb 16 13:51:46 compute-1 NetworkManager[56388]: <info>  [1771249906.6860] device (tap6fae2663-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:51:46 compute-1 ovn_controller[96285]: 2026-02-16T13:51:46Z|00194|binding|INFO|Releasing lport 6fae2663-44ca-4efe-b29c-5df60eb75e74 from this chassis (sb_readonly=0)
Feb 16 13:51:46 compute-1 ovn_controller[96285]: 2026-02-16T13:51:46Z|00195|binding|INFO|Setting lport 6fae2663-44ca-4efe-b29c-5df60eb75e74 down in Southbound
Feb 16 13:51:46 compute-1 ovn_controller[96285]: 2026-02-16T13:51:46Z|00196|binding|INFO|Removing iface tap6fae2663-44 ovn-installed in OVS
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.698 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.701 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.710 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.737 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:be:11 10.100.0.14'], port_security=['fa:16:3e:cc:be:11 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e72c2fc7-4686-493c-ac27-4a865859dd3d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=6fae2663-44ca-4efe-b29c-5df60eb75e74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.738 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 6fae2663-44ca-4efe-b29c-5df60eb75e74 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f unbound from our chassis
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.739 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 25f604b5-711f-4df5-a65b-4ca0c988350f
Feb 16 13:51:46 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Feb 16 13:51:46 compute-1 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Consumed 13.725s CPU time.
Feb 16 13:51:46 compute-1 systemd-machined[155419]: Machine qemu-16-instance-0000001a terminated.
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.752 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a5c2ae-ac62-45b1-8856-c8961de66356]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.790 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[a46ab1d1-97b4-4d69-8dd6-b0acf922737b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.795 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[32063faf-72a6-4073-a02e-667300345246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.820 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d17af2ed-7b78-4c20-be32-495e95c7df27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.835 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2362c3fb-edc5-45d7-9b5f-9ef51a2c4974]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap25f604b5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:17:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595786, 'reachable_time': 38473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216074, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.847 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4321f6cb-855e-4320-9361-4acdd623cc35]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap25f604b5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595795, 'tstamp': 595795}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216076, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap25f604b5-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595797, 'tstamp': 595797}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216076, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.849 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.851 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.854 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.855 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25f604b5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.856 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.857 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap25f604b5-70, col_values=(('external_ids', {'iface-id': 'a43b300c-9dd2-4ad8-8dd7-aaeb277a3352'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:46 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:46.857 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.891 185914 INFO nova.virt.libvirt.driver [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Instance destroyed successfully.
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.892 185914 DEBUG nova.objects.instance [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'resources' on Instance uuid e72c2fc7-4686-493c-ac27-4a865859dd3d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.957 185914 DEBUG nova.virt.libvirt.vif [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:51:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-965605715',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-965605715',id=26,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:51:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-xljyumbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:51:14Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=e72c2fc7-4686-493c-ac27-4a865859dd3d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.957 185914 DEBUG nova.network.os_vif_util [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "address": "fa:16:3e:cc:be:11", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6fae2663-44", "ovs_interfaceid": "6fae2663-44ca-4efe-b29c-5df60eb75e74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.958 185914 DEBUG nova.network.os_vif_util [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.958 185914 DEBUG os_vif [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.960 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:46 compute-1 nova_compute[185910]: 2026-02-16 13:51:46.960 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6fae2663-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.007 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.010 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.013 185914 INFO os_vif [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:be:11,bridge_name='br-int',has_traffic_filtering=True,id=6fae2663-44ca-4efe-b29c-5df60eb75e74,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6fae2663-44')
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.014 185914 INFO nova.virt.libvirt.driver [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Deleting instance files /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d_del
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.014 185914 INFO nova.virt.libvirt.driver [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Deletion of /var/lib/nova/instances/e72c2fc7-4686-493c-ac27-4a865859dd3d_del complete
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.199 185914 INFO nova.compute.manager [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Took 0.57 seconds to destroy the instance on the hypervisor.
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.200 185914 DEBUG oslo.service.loopingcall [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.200 185914 DEBUG nova.compute.manager [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.201 185914 DEBUG nova.network.neutron [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.403 185914 DEBUG nova.compute.manager [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-unplugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.404 185914 DEBUG oslo_concurrency.lockutils [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.404 185914 DEBUG oslo_concurrency.lockutils [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.404 185914 DEBUG oslo_concurrency.lockutils [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.404 185914 DEBUG nova.compute.manager [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] No waiting events found dispatching network-vif-unplugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.405 185914 DEBUG nova.compute.manager [req-fa8290dd-8e33-47f7-978b-b25154311f54 req-bb9a8334-bb68-4534-8e93-3add6292782e faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-unplugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:51:47 compute-1 nova_compute[185910]: 2026-02-16 13:51:47.447 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:47 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:47.448 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:47 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:47.449 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:51:47 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:47.449 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:48 compute-1 sshd-session[216093]: Invalid user postgres from 188.166.42.159 port 52992
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.436 185914 DEBUG nova.network.neutron [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.462 185914 INFO nova.compute.manager [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Took 1.26 seconds to deallocate network for instance.
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.476 185914 DEBUG nova.compute.manager [req-a2d7750d-240a-493b-9c82-22e178220386 req-c9e6cf1f-a9eb-431b-963f-f03f2fd7d76d faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-deleted-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.528 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.528 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:48 compute-1 sshd-session[216093]: Connection closed by invalid user postgres 188.166.42.159 port 52992 [preauth]
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.615 185914 DEBUG nova.compute.provider_tree [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.638 185914 DEBUG nova.scheduler.client.report [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.658 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.679 185914 INFO nova.scheduler.client.report [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Deleted allocations for instance e72c2fc7-4686-493c-ac27-4a865859dd3d
Feb 16 13:51:48 compute-1 nova_compute[185910]: 2026-02-16 13:51:48.746 185914 DEBUG oslo_concurrency.lockutils [None req-ba70345c-6381-437d-ac18-14a7eba7e8cd c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:49 compute-1 sshd-session[216095]: Invalid user test from 146.190.226.24 port 51392
Feb 16 13:51:49 compute-1 sshd-session[216095]: Connection closed by invalid user test 146.190.226.24 port 51392 [preauth]
Feb 16 13:51:49 compute-1 openstack_network_exporter[198096]: ERROR   13:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:51:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:51:49 compute-1 openstack_network_exporter[198096]: ERROR   13:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:51:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.553 185914 DEBUG nova.compute.manager [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.553 185914 DEBUG oslo_concurrency.lockutils [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.553 185914 DEBUG oslo_concurrency.lockutils [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.553 185914 DEBUG oslo_concurrency.lockutils [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "e72c2fc7-4686-493c-ac27-4a865859dd3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.554 185914 DEBUG nova.compute.manager [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] No waiting events found dispatching network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.554 185914 WARNING nova.compute.manager [req-896546fa-0a9e-4660-abac-158f503a034e req-9bf234e0-1484-4aa0-84d4-c60eb592126f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Received unexpected event network-vif-plugged-6fae2663-44ca-4efe-b29c-5df60eb75e74 for instance with vm_state deleted and task_state None.
Feb 16 13:51:49 compute-1 nova_compute[185910]: 2026-02-16 13:51:49.762 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.103 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.104 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.104 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.104 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.104 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.106 185914 INFO nova.compute.manager [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Terminating instance
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.107 185914 DEBUG nova.compute.manager [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 13:51:50 compute-1 kernel: tapb7a22eb4-a3 (unregistering): left promiscuous mode
Feb 16 13:51:50 compute-1 NetworkManager[56388]: <info>  [1771249910.1338] device (tapb7a22eb4-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.134 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.139 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00197|binding|INFO|Releasing lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 from this chassis (sb_readonly=0)
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00198|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 down in Southbound
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00199|binding|INFO|Removing iface tapb7a22eb4-a3 ovn-installed in OVS
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.142 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.146 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.147 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.150 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f unbound from our chassis
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.151 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25f604b5-711f-4df5-a65b-4ca0c988350f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.152 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ecefc54e-8b17-413c-b456-1cc098fbc75c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.153 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f namespace which is not needed anymore
Feb 16 13:51:50 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Deactivated successfully.
Feb 16 13:51:50 compute-1 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000019.scope: Consumed 1.655s CPU time.
Feb 16 13:51:50 compute-1 systemd-machined[155419]: Machine qemu-17-instance-00000019 terminated.
Feb 16 13:51:50 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [NOTICE]   (215834) : haproxy version is 2.8.14-c23fe91
Feb 16 13:51:50 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [NOTICE]   (215834) : path to executable is /usr/sbin/haproxy
Feb 16 13:51:50 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [WARNING]  (215834) : Exiting Master process...
Feb 16 13:51:50 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [ALERT]    (215834) : Current worker (215836) exited with code 143 (Terminated)
Feb 16 13:51:50 compute-1 neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f[215830]: [WARNING]  (215834) : All workers exited. Exiting... (0)
Feb 16 13:51:50 compute-1 systemd[1]: libpod-020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe.scope: Deactivated successfully.
Feb 16 13:51:50 compute-1 podman[216121]: 2026-02-16 13:51:50.289812627 +0000 UTC m=+0.044417738 container died 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:51:50 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe-userdata-shm.mount: Deactivated successfully.
Feb 16 13:51:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-61827dd2e2e8cb061c6e34812348b62529dd5e3984f1c674ba5da276507abf7c-merged.mount: Deactivated successfully.
Feb 16 13:51:50 compute-1 kernel: tapb7a22eb4-a3: entered promiscuous mode
Feb 16 13:51:50 compute-1 NetworkManager[56388]: <info>  [1771249910.3259] manager: (tapb7a22eb4-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Feb 16 13:51:50 compute-1 systemd-udevd[216103]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:51:50 compute-1 kernel: tapb7a22eb4-a3 (unregistering): left promiscuous mode
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.328 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00200|binding|INFO|Claiming lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for this chassis.
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00201|binding|INFO|b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1: Claiming fa:16:3e:cd:81:d6 10.100.0.12
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.338 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:50 compute-1 podman[216121]: 2026-02-16 13:51:50.342659872 +0000 UTC m=+0.097264983 container cleanup 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00202|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 ovn-installed in OVS
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00203|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 up in Southbound
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.344 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00204|binding|INFO|Releasing lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 from this chassis (sb_readonly=1)
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00205|binding|INFO|Removing iface tapb7a22eb4-a3 ovn-installed in OVS
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00206|if_status|INFO|Dropped 3 log messages in last 787 seconds (most recently, 787 seconds ago) due to excessive rate
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00207|if_status|INFO|Not setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 down as sb is readonly
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00208|binding|INFO|Releasing lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 from this chassis (sb_readonly=0)
Feb 16 13:51:50 compute-1 ovn_controller[96285]: 2026-02-16T13:51:50Z|00209|binding|INFO|Setting lport b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 down in Southbound
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.347 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.350 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 systemd[1]: libpod-conmon-020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe.scope: Deactivated successfully.
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.351 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:81:d6 10.100.0.12'], port_security=['fa:16:3e:cd:81:d6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '6b24deb5-a1f1-4154-a8a4-c31c69dc5d32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25f604b5-711f-4df5-a65b-4ca0c988350f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c4a5b3f08ab466eaac86305d91fd9a8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '238bb162-7cdb-4292-ac0d-7fe46bc858a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a96acfaa-4a70-40f4-bebe-b6fb536cb5a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.372 185914 INFO nova.virt.libvirt.driver [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Instance destroyed successfully.
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.373 185914 DEBUG nova.objects.instance [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lazy-loading 'resources' on Instance uuid 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.390 185914 DEBUG nova.virt.libvirt.vif [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T13:50:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-985821402',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-985821402',id=25,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5c4a5b3f08ab466eaac86305d91fd9a8',ramdisk_id='',reservation_id='r-jwp5hm0l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1500862259-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:51:42Z,user_data=None,user_id='c7c8dce27a2f4917a7dac485b1d8754a',uuid=6b24deb5-a1f1-4154-a8a4-c31c69dc5d32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.390 185914 DEBUG nova.network.os_vif_util [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converting VIF {"id": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "address": "fa:16:3e:cd:81:d6", "network": {"id": "25f604b5-711f-4df5-a65b-4ca0c988350f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1415641352-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c4a5b3f08ab466eaac86305d91fd9a8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7a22eb4-a3", "ovs_interfaceid": "b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.391 185914 DEBUG nova.network.os_vif_util [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.391 185914 DEBUG os_vif [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.393 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.393 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7a22eb4-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.395 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.396 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.398 185914 INFO os_vif [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cd:81:d6,bridge_name='br-int',has_traffic_filtering=True,id=b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1,network=Network(25f604b5-711f-4df5-a65b-4ca0c988350f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7a22eb4-a3')
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.399 185914 INFO nova.virt.libvirt.driver [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Deleting instance files /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32_del
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.399 185914 INFO nova.virt.libvirt.driver [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Deletion of /var/lib/nova/instances/6b24deb5-a1f1-4154-a8a4-c31c69dc5d32_del complete
Feb 16 13:51:50 compute-1 podman[216157]: 2026-02-16 13:51:50.408529267 +0000 UTC m=+0.042670481 container remove 020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.412 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5b95fd80-dd74-4866-aa1e-da0ef2b2ff22]: (4, ('Mon Feb 16 01:51:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f (020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe)\n020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe\nMon Feb 16 01:51:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f (020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe)\n020642fdee414f5da95f3c98bb58a0dd66b7845f5632a950e971e9df9a2bfbbe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.413 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8cecd1c5-70bb-4dce-9924-5ef850b1cf5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.414 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25f604b5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.416 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 kernel: tap25f604b5-70: left promiscuous mode
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.420 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.420 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.423 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e46080eb-0173-4703-8452-b809d7f927f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.442 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[59c83e5e-60cc-48ff-9db9-201e216c9c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.445 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0138047f-481b-4efe-a51b-c375f53c6dc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.452 185914 INFO nova.compute.manager [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.453 185914 DEBUG oslo.service.loopingcall [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.454 185914 DEBUG nova.compute.manager [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 13:51:50 compute-1 nova_compute[185910]: 2026-02-16 13:51:50.454 185914 DEBUG nova.network.neutron [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.458 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a334ed-293a-42c8-833f-38440868baea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595778, 'reachable_time': 41914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216175, 'error': None, 'target': 'ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.461 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-25f604b5-711f-4df5-a65b-4ca0c988350f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.461 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6381b8-78ea-479f-8f27-9239e5aad41d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.462 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f unbound from our chassis
Feb 16 13:51:50 compute-1 systemd[1]: run-netns-ovnmeta\x2d25f604b5\x2d711f\x2d4df5\x2da65b\x2d4ca0c988350f.mount: Deactivated successfully.
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.464 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25f604b5-711f-4df5-a65b-4ca0c988350f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.464 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ad1e4c-b507-4eaf-8598-e2422aff3960]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.465 105573 INFO neutron.agent.ovn.metadata.agent [-] Port b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 in datapath 25f604b5-711f-4df5-a65b-4ca0c988350f unbound from our chassis
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.465 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25f604b5-711f-4df5-a65b-4ca0c988350f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:51:50 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:51:50.466 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[82f8b3fe-c003-4b38-8462-715101b0ca28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:51:51 compute-1 podman[216176]: 2026-02-16 13:51:51.929334891 +0000 UTC m=+0.073868092 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.334 185914 DEBUG nova.compute.manager [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.335 185914 DEBUG oslo_concurrency.lockutils [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.335 185914 DEBUG oslo_concurrency.lockutils [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.335 185914 DEBUG oslo_concurrency.lockutils [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.335 185914 DEBUG nova.compute.manager [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.335 185914 DEBUG nova.compute.manager [req-db3476ba-f645-4cd4-bb00-4ced19f9e651 req-d88acf82-7a4d-4415-b5b4-31c30acf3cda faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-unplugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.503 185914 DEBUG nova.network.neutron [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.529 185914 INFO nova.compute.manager [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Took 3.07 seconds to deallocate network for instance.
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.575 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.575 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.581 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.610 185914 INFO nova.scheduler.client.report [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Deleted allocations for instance 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32
Feb 16 13:51:53 compute-1 nova_compute[185910]: 2026-02-16 13:51:53.678 185914 DEBUG oslo_concurrency.lockutils [None req-fa6d16f2-d532-412c-a130-401b1a9e5d17 c7c8dce27a2f4917a7dac485b1d8754a 5c4a5b3f08ab466eaac86305d91fd9a8 - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:54 compute-1 nova_compute[185910]: 2026-02-16 13:51:54.765 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.396 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.452 185914 DEBUG nova.compute.manager [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.453 185914 DEBUG oslo_concurrency.lockutils [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.453 185914 DEBUG oslo_concurrency.lockutils [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.453 185914 DEBUG oslo_concurrency.lockutils [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "6b24deb5-a1f1-4154-a8a4-c31c69dc5d32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.454 185914 DEBUG nova.compute.manager [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] No waiting events found dispatching network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.454 185914 WARNING nova.compute.manager [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received unexpected event network-vif-plugged-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 for instance with vm_state deleted and task_state None.
Feb 16 13:51:55 compute-1 nova_compute[185910]: 2026-02-16 13:51:55.454 185914 DEBUG nova.compute.manager [req-2c56c58f-b2be-484a-b432-cc910f16ba08 req-c08f45c3-b3dd-41d5-a238-1bc38714179f faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Received event network-vif-deleted-b7a22eb4-a31d-4a96-a5a9-9e56f37cecc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:51:58 compute-1 podman[216202]: 2026-02-16 13:51:58.920899696 +0000 UTC m=+0.060923402 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:51:59 compute-1 nova_compute[185910]: 2026-02-16 13:51:59.767 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:00 compute-1 nova_compute[185910]: 2026-02-16 13:52:00.399 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:01 compute-1 anacron[60676]: Job `cron.weekly' started
Feb 16 13:52:01 compute-1 anacron[60676]: Job `cron.weekly' terminated
Feb 16 13:52:01 compute-1 nova_compute[185910]: 2026-02-16 13:52:01.891 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249906.8892798, e72c2fc7-4686-493c-ac27-4a865859dd3d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:52:01 compute-1 nova_compute[185910]: 2026-02-16 13:52:01.891 185914 INFO nova.compute.manager [-] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] VM Stopped (Lifecycle Event)
Feb 16 13:52:01 compute-1 nova_compute[185910]: 2026-02-16 13:52:01.914 185914 DEBUG nova.compute.manager [None req-0242755f-7bfd-43df-83c1-845b00de98bf - - - - - -] [instance: e72c2fc7-4686-493c-ac27-4a865859dd3d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:52:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:03.364 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:03.364 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:03.365 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:04 compute-1 nova_compute[185910]: 2026-02-16 13:52:04.769 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:05 compute-1 nova_compute[185910]: 2026-02-16 13:52:05.371 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771249910.36943, 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:52:05 compute-1 nova_compute[185910]: 2026-02-16 13:52:05.372 185914 INFO nova.compute.manager [-] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] VM Stopped (Lifecycle Event)
Feb 16 13:52:05 compute-1 nova_compute[185910]: 2026-02-16 13:52:05.404 185914 DEBUG nova.compute.manager [None req-b036a2f0-091e-4ce9-baa8-3fd04c05d2b3 - - - - - -] [instance: 6b24deb5-a1f1-4154-a8a4-c31c69dc5d32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:52:05 compute-1 nova_compute[185910]: 2026-02-16 13:52:05.436 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:05 compute-1 podman[195236]: time="2026-02-16T13:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:52:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:52:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 13:52:09 compute-1 nova_compute[185910]: 2026-02-16 13:52:09.771 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:10 compute-1 nova_compute[185910]: 2026-02-16 13:52:10.440 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:13 compute-1 podman[216230]: 2026-02-16 13:52:13.918858352 +0000 UTC m=+0.052477946 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:52:13 compute-1 podman[216229]: 2026-02-16 13:52:13.932918121 +0000 UTC m=+0.065807815 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Feb 16 13:52:14 compute-1 nova_compute[185910]: 2026-02-16 13:52:14.774 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:15 compute-1 nova_compute[185910]: 2026-02-16 13:52:15.284 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:15 compute-1 nova_compute[185910]: 2026-02-16 13:52:15.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:19 compute-1 openstack_network_exporter[198096]: ERROR   13:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:52:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:52:19 compute-1 openstack_network_exporter[198096]: ERROR   13:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:52:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:52:19 compute-1 nova_compute[185910]: 2026-02-16 13:52:19.777 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:20 compute-1 nova_compute[185910]: 2026-02-16 13:52:20.495 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:22 compute-1 podman[216270]: 2026-02-16 13:52:22.948573056 +0000 UTC m=+0.092273367 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 16 13:52:24 compute-1 nova_compute[185910]: 2026-02-16 13:52:24.778 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:25 compute-1 nova_compute[185910]: 2026-02-16 13:52:25.497 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:25 compute-1 nova_compute[185910]: 2026-02-16 13:52:25.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:28 compute-1 nova_compute[185910]: 2026-02-16 13:52:28.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:28 compute-1 nova_compute[185910]: 2026-02-16 13:52:28.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:29 compute-1 nova_compute[185910]: 2026-02-16 13:52:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:29 compute-1 nova_compute[185910]: 2026-02-16 13:52:29.781 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:29 compute-1 podman[216297]: 2026-02-16 13:52:29.922113127 +0000 UTC m=+0.065302552 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:52:30 compute-1 nova_compute[185910]: 2026-02-16 13:52:30.500 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:30 compute-1 nova_compute[185910]: 2026-02-16 13:52:30.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:30 compute-1 nova_compute[185910]: 2026-02-16 13:52:30.859 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:30 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:30.859 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:52:30 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:30.860 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.664 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.665 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.820 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.822 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5799MB free_disk=73.22308349609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.917 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.917 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:52:32 compute-1 nova_compute[185910]: 2026-02-16 13:52:32.945 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:52:33 compute-1 nova_compute[185910]: 2026-02-16 13:52:33.133 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:52:33 compute-1 nova_compute[185910]: 2026-02-16 13:52:33.174 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:52:33 compute-1 nova_compute[185910]: 2026-02-16 13:52:33.174 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:34 compute-1 nova_compute[185910]: 2026-02-16 13:52:34.782 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:35 compute-1 nova_compute[185910]: 2026-02-16 13:52:35.502 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:35 compute-1 podman[195236]: time="2026-02-16T13:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:52:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:52:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:52:36 compute-1 nova_compute[185910]: 2026-02-16 13:52:36.169 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:36 compute-1 nova_compute[185910]: 2026-02-16 13:52:36.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:36 compute-1 nova_compute[185910]: 2026-02-16 13:52:36.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:52:36 compute-1 nova_compute[185910]: 2026-02-16 13:52:36.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:52:36 compute-1 nova_compute[185910]: 2026-02-16 13:52:36.654 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:52:38 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:52:38.862 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:52:39 compute-1 nova_compute[185910]: 2026-02-16 13:52:39.829 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:40 compute-1 nova_compute[185910]: 2026-02-16 13:52:40.509 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:41 compute-1 nova_compute[185910]: 2026-02-16 13:52:41.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:41 compute-1 nova_compute[185910]: 2026-02-16 13:52:41.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:52:44 compute-1 sshd-session[216324]: Invalid user postgres from 188.166.42.159 port 35346
Feb 16 13:52:44 compute-1 podman[216327]: 2026-02-16 13:52:44.467512275 +0000 UTC m=+0.058866938 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 16 13:52:44 compute-1 podman[216326]: 2026-02-16 13:52:44.468197723 +0000 UTC m=+0.071214780 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc.)
Feb 16 13:52:44 compute-1 sshd-session[216324]: Connection closed by invalid user postgres 188.166.42.159 port 35346 [preauth]
Feb 16 13:52:44 compute-1 nova_compute[185910]: 2026-02-16 13:52:44.831 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:45 compute-1 nova_compute[185910]: 2026-02-16 13:52:45.511 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:49 compute-1 openstack_network_exporter[198096]: ERROR   13:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:52:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:52:49 compute-1 openstack_network_exporter[198096]: ERROR   13:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:52:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:52:49 compute-1 nova_compute[185910]: 2026-02-16 13:52:49.834 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:50 compute-1 ovn_controller[96285]: 2026-02-16T13:52:50Z|00210|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:52:50 compute-1 nova_compute[185910]: 2026-02-16 13:52:50.514 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.632 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.632 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.633 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.633 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.634 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.634 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.664 185914 DEBUG nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.664 185914 WARNING nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Unknown base file: /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.665 185914 INFO nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Removable base files: /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.665 185914 INFO nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.666 185914 DEBUG nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.666 185914 DEBUG nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Feb 16 13:52:53 compute-1 nova_compute[185910]: 2026-02-16 13:52:53.666 185914 DEBUG nova.virt.libvirt.imagecache [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Feb 16 13:52:53 compute-1 podman[216368]: 2026-02-16 13:52:53.937183568 +0000 UTC m=+0.080320316 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 16 13:52:54 compute-1 nova_compute[185910]: 2026-02-16 13:52:54.872 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:55 compute-1 nova_compute[185910]: 2026-02-16 13:52:55.516 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:52:56 compute-1 sshd-session[216395]: Invalid user test from 146.190.226.24 port 41278
Feb 16 13:52:56 compute-1 sshd-session[216395]: Connection closed by invalid user test 146.190.226.24 port 41278 [preauth]
Feb 16 13:52:59 compute-1 nova_compute[185910]: 2026-02-16 13:52:59.873 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:00 compute-1 nova_compute[185910]: 2026-02-16 13:53:00.519 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:00 compute-1 podman[216397]: 2026-02-16 13:53:00.910817441 +0000 UTC m=+0.054964722 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:53:03 compute-1 sshd-session[216422]: Connection closed by authenticating user root 2.57.122.210 port 48814 [preauth]
Feb 16 13:53:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:03.365 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:03.366 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:03.366 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:04 compute-1 nova_compute[185910]: 2026-02-16 13:53:04.926 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:05 compute-1 nova_compute[185910]: 2026-02-16 13:53:05.521 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:05 compute-1 podman[195236]: time="2026-02-16T13:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:53:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:53:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.572 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.573 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.590 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.666 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.667 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.676 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.676 185914 INFO nova.compute.claims [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.783 185914 DEBUG nova.compute.provider_tree [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.798 185914 DEBUG nova.scheduler.client.report [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.821 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.822 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.865 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.866 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.891 185914 INFO nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:53:06 compute-1 nova_compute[185910]: 2026-02-16 13:53:06.912 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.064 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.066 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.066 185914 INFO nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating image(s)
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.066 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.067 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.067 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.081 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.131 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.132 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.132 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.143 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.190 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.191 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.205 185914 DEBUG nova.policy [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '178de9ab917a4ba5a84dc9f520a0847f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.218 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.219 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.219 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.264 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.265 185914 DEBUG nova.virt.disk.api [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Checking if we can resize image /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.265 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.326 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.328 185914 DEBUG nova.virt.disk.api [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Cannot resize image /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.328 185914 DEBUG nova.objects.instance [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'migration_context' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.346 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.346 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Ensure instance console log exists: /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.347 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.348 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.348 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:07 compute-1 nova_compute[185910]: 2026-02-16 13:53:07.888 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Successfully created port: da0306b3-8514-4ef0-984c-14d90dedd285 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.172 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Successfully updated port: da0306b3-8514-4ef0-984c-14d90dedd285 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.189 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.190 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquired lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.190 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.366 185914 DEBUG nova.compute.manager [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-changed-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.366 185914 DEBUG nova.compute.manager [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Refreshing instance network info cache due to event network-changed-da0306b3-8514-4ef0-984c-14d90dedd285. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.367 185914 DEBUG oslo_concurrency.lockutils [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.408 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:53:09 compute-1 nova_compute[185910]: 2026-02-16 13:53:09.929 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.179 185914 DEBUG nova.network.neutron [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.200 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Releasing lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.201 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance network_info: |[{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.202 185914 DEBUG oslo_concurrency.lockutils [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.202 185914 DEBUG nova.network.neutron [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Refreshing network info cache for port da0306b3-8514-4ef0-984c-14d90dedd285 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.207 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Start _get_guest_xml network_info=[{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.213 185914 WARNING nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.218 185914 DEBUG nova.virt.libvirt.host [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.219 185914 DEBUG nova.virt.libvirt.host [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.232 185914 DEBUG nova.virt.libvirt.host [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.233 185914 DEBUG nova.virt.libvirt.host [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.235 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.236 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.236 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.236 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.237 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.237 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.237 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.237 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.237 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.238 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.238 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.238 185914 DEBUG nova.virt.hardware [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.242 185914 DEBUG nova.virt.libvirt.vif [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBala
ncingStrategy-492275053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:53:06Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.243 185914 DEBUG nova.network.os_vif_util [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.243 185914 DEBUG nova.network.os_vif_util [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.244 185914 DEBUG nova.objects.instance [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lazy-loading 'pci_devices' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.260 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <uuid>b81c5faa-2832-4df4-8db7-1ffb8d8209ab</uuid>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <name>instance-0000001c</name>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349</nova:name>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:53:10</nova:creationTime>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:user uuid="178de9ab917a4ba5a84dc9f520a0847f">tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member</nova:user>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:project uuid="88d7e9d22dc247d4b0e2e95ecc7e73ad">tempest-TestExecuteWorkloadBalancingStrategy-492275053</nova:project>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         <nova:port uuid="da0306b3-8514-4ef0-984c-14d90dedd285">
Feb 16 13:53:10 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <system>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="serial">b81c5faa-2832-4df4-8db7-1ffb8d8209ab</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="uuid">b81c5faa-2832-4df4-8db7-1ffb8d8209ab</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </system>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <os>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </os>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <features>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </features>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:04:49:ad"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <target dev="tapda0306b3-85"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/console.log" append="off"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <video>
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </video>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:53:10 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:53:10 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:53:10 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:53:10 compute-1 nova_compute[185910]: </domain>
Feb 16 13:53:10 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.261 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Preparing to wait for external event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.261 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.261 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.262 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.262 185914 DEBUG nova.virt.libvirt.vif [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWo
rkloadBalancingStrategy-492275053-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:53:06Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.262 185914 DEBUG nova.network.os_vif_util [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.263 185914 DEBUG nova.network.os_vif_util [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.263 185914 DEBUG os_vif [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.264 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.264 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.264 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.267 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.267 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda0306b3-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.267 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda0306b3-85, col_values=(('external_ids', {'iface-id': 'da0306b3-8514-4ef0-984c-14d90dedd285', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:49:ad', 'vm-uuid': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.269 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.2706] manager: (tapda0306b3-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.273 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.275 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.276 185914 INFO os_vif [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85')
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.322 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.322 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.323 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] No VIF found with MAC fa:16:3e:04:49:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.323 185914 INFO nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Using config drive
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.649 185914 INFO nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Creating config drive at /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.653 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcq4e6b_x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.774 185914 DEBUG oslo_concurrency.processutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcq4e6b_x" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:10 compute-1 kernel: tapda0306b3-85: entered promiscuous mode
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.8304] manager: (tapda0306b3-85): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.831 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 ovn_controller[96285]: 2026-02-16T13:53:10Z|00211|binding|INFO|Claiming lport da0306b3-8514-4ef0-984c-14d90dedd285 for this chassis.
Feb 16 13:53:10 compute-1 ovn_controller[96285]: 2026-02-16T13:53:10Z|00212|binding|INFO|da0306b3-8514-4ef0-984c-14d90dedd285: Claiming fa:16:3e:04:49:ad 10.100.0.5
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.837 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.847 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:49:ad 10.100.0.5'], port_security=['fa:16:3e:04:49:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '2', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=da0306b3-8514-4ef0-984c-14d90dedd285) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.849 105573 INFO neutron.agent.ovn.metadata.agent [-] Port da0306b3-8514-4ef0-984c-14d90dedd285 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c bound to our chassis
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.851 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.855 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 systemd-udevd[216459]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.858 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac1c2ad-0db3-40af-8dc6-114dbd2591ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.859 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f3f30c5-b1 in ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:53:10 compute-1 systemd-machined[155419]: New machine qemu-18-instance-0000001c.
Feb 16 13:53:10 compute-1 ovn_controller[96285]: 2026-02-16T13:53:10Z|00213|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 ovn-installed in OVS
Feb 16 13:53:10 compute-1 ovn_controller[96285]: 2026-02-16T13:53:10Z|00214|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 up in Southbound
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.861 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f3f30c5-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.861 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2edd4b03-1189-4335-adf2-c205d7fe2c31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 nova_compute[185910]: 2026-02-16 13:53:10.861 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.862 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb03eb5-2a86-49ba-bc02-8a9cf2d396ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.8678] device (tapda0306b3-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.8683] device (tapda0306b3-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:53:10 compute-1 systemd[1]: Started Virtual Machine qemu-18-instance-0000001c.
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.872 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[9371a8f2-7989-425e-ad07-6aa2f8ea415d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.879 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1ee7c3-907b-4a79-9ce9-0502d076d72a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.907 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[c6eb912f-1a7f-426b-8b49-85dae77624e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.912 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[9c776f2a-7dca-4888-832a-e25fd27b97e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 systemd-udevd[216463]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.9136] manager: (tap9f3f30c5-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.935 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d18537-04ad-46d0-8224-1d37456ce1fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.939 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[42ae672e-2a7e-43a0-8569-1b4d8f9822a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 NetworkManager[56388]: <info>  [1771249990.9528] device (tap9f3f30c5-b0): carrier: link connected
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.955 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[b721cedd-eae9-4584-b491-b22eca095678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.968 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c9e23e-f3f4-49aa-a5d0-049b41033146]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607537, 'reachable_time': 36294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216492, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.978 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[12385056-8709-4e0d-8981-5c92aff5c7fd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:b836'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 607537, 'tstamp': 607537}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216493, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:10 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:10.987 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a5888a-e5c6-45a6-906d-c967bb790760]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f3f30c5-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:b8:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607537, 'reachable_time': 36294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216494, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.007 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7b2f718c-58f1-451e-b39d-c49f5c69a7bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.039 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[25048164-cdaa-44e2-bb4f-8dffb19c1f65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.041 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.042 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.042 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f3f30c5-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.044 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-1 NetworkManager[56388]: <info>  [1771249991.0446] manager: (tap9f3f30c5-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Feb 16 13:53:11 compute-1 kernel: tap9f3f30c5-b0: entered promiscuous mode
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.046 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.047 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f3f30c5-b0, col_values=(('external_ids', {'iface-id': '340fa0af-180b-44ed-9c22-e18a8f5ebdec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.047 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-1 ovn_controller[96285]: 2026-02-16T13:53:11Z|00215|binding|INFO|Releasing lport 340fa0af-180b-44ed-9c22-e18a8f5ebdec from this chassis (sb_readonly=0)
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.048 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.048 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.049 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[fa10daa3-7769-4f4b-bdcf-738aad309013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.050 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.pid.haproxy
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:53:11 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:11.051 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'env', 'PROCESS_TAG=haproxy-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.052 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.121 185914 DEBUG nova.compute.manager [req-3e3f0f45-74ad-45c2-9053-2acef72714bf req-f5c14394-286b-4088-bb41-fe813d8d1488 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.122 185914 DEBUG oslo_concurrency.lockutils [req-3e3f0f45-74ad-45c2-9053-2acef72714bf req-f5c14394-286b-4088-bb41-fe813d8d1488 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.122 185914 DEBUG oslo_concurrency.lockutils [req-3e3f0f45-74ad-45c2-9053-2acef72714bf req-f5c14394-286b-4088-bb41-fe813d8d1488 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.122 185914 DEBUG oslo_concurrency.lockutils [req-3e3f0f45-74ad-45c2-9053-2acef72714bf req-f5c14394-286b-4088-bb41-fe813d8d1488 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.122 185914 DEBUG nova.compute.manager [req-3e3f0f45-74ad-45c2-9053-2acef72714bf req-f5c14394-286b-4088-bb41-fe813d8d1488 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Processing event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:53:11 compute-1 podman[216526]: 2026-02-16 13:53:11.359573506 +0000 UTC m=+0.045170299 container create 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.374 185914 DEBUG nova.network.neutron [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updated VIF entry in instance network info cache for port da0306b3-8514-4ef0-984c-14d90dedd285. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.374 185914 DEBUG nova.network.neutron [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:11 compute-1 systemd[1]: Started libpod-conmon-0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4.scope.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.397 185914 DEBUG oslo_concurrency.lockutils [req-5dfcdaca-0a1e-48f8-88b5-1f08df1cb6f8 req-a9d12eed-c6cc-4ab2-b805-d4d684ea079b faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:11 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:53:11 compute-1 podman[216526]: 2026-02-16 13:53:11.335484576 +0000 UTC m=+0.021081389 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:53:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60d8caf46271ba596eaf96c4e3db19c82bf62f81bb280bb84e83780a44d83734/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:53:11 compute-1 podman[216526]: 2026-02-16 13:53:11.442203133 +0000 UTC m=+0.127799946 container init 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 16 13:53:11 compute-1 podman[216526]: 2026-02-16 13:53:11.446594181 +0000 UTC m=+0.132190974 container start 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 13:53:11 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [NOTICE]   (216545) : New worker (216547) forked
Feb 16 13:53:11 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [NOTICE]   (216545) : Loading success.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.741 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249991.7403471, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.742 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Started (Lifecycle Event)
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.747 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.752 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.758 185914 INFO nova.virt.libvirt.driver [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance spawned successfully.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.759 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.782 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.785 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.795 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.795 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.795 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.796 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.796 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.796 185914 DEBUG nova.virt.libvirt.driver [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.834 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.834 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249991.7406955, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.834 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Paused (Lifecycle Event)
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.860 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.864 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771249991.7514353, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.864 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Resumed (Lifecycle Event)
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.874 185914 INFO nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Took 4.81 seconds to spawn the instance on the hypervisor.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.874 185914 DEBUG nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.904 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.909 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.956 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:53:11 compute-1 nova_compute[185910]: 2026-02-16 13:53:11.988 185914 INFO nova.compute.manager [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Took 5.35 seconds to build instance.
Feb 16 13:53:12 compute-1 nova_compute[185910]: 2026-02-16 13:53:12.003 185914 DEBUG oslo_concurrency.lockutils [None req-43bee644-c097-40de-b255-9ea0af8a1392 178de9ab917a4ba5a84dc9f520a0847f 88d7e9d22dc247d4b0e2e95ecc7e73ad - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.248 185914 DEBUG nova.compute.manager [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.249 185914 DEBUG oslo_concurrency.lockutils [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.249 185914 DEBUG oslo_concurrency.lockutils [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.250 185914 DEBUG oslo_concurrency.lockutils [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.250 185914 DEBUG nova.compute.manager [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:13 compute-1 nova_compute[185910]: 2026-02-16 13:53:13.250 185914 WARNING nova.compute.manager [req-13bd45d0-0ead-472b-9147-f82c68fd1e68 req-34ce1c83-74bb-4c6f-9db5-b3f162a4b421 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state None.
Feb 16 13:53:14 compute-1 podman[216563]: 2026-02-16 13:53:14.927880269 +0000 UTC m=+0.068578960 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7)
Feb 16 13:53:14 compute-1 nova_compute[185910]: 2026-02-16 13:53:14.978 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:14 compute-1 podman[216564]: 2026-02-16 13:53:14.982012198 +0000 UTC m=+0.120696525 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 13:53:15 compute-1 nova_compute[185910]: 2026-02-16 13:53:15.269 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:19 compute-1 openstack_network_exporter[198096]: ERROR   13:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:53:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:53:19 compute-1 openstack_network_exporter[198096]: ERROR   13:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:53:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:53:19 compute-1 nova_compute[185910]: 2026-02-16 13:53:19.981 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.094 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Check if temp file /var/lib/nova/instances/tmpq8uj_sv2 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.095 185914 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.271 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.631 185914 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.688 185914 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.690 185914 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:53:20 compute-1 nova_compute[185910]: 2026-02-16 13:53:20.738 185914 DEBUG oslo_concurrency.processutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:53:22 compute-1 sshd-session[216609]: Accepted publickey for nova from 192.168.122.100 port 48642 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:53:22 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:53:22 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:53:22 compute-1 systemd-logind[821]: New session 48 of user nova.
Feb 16 13:53:22 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:53:22 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:53:22 compute-1 systemd[216615]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:53:22 compute-1 systemd[216615]: Queued start job for default target Main User Target.
Feb 16 13:53:22 compute-1 systemd[216615]: Created slice User Application Slice.
Feb 16 13:53:22 compute-1 systemd[216615]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:53:22 compute-1 systemd[216615]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:53:22 compute-1 systemd[216615]: Reached target Paths.
Feb 16 13:53:22 compute-1 systemd[216615]: Reached target Timers.
Feb 16 13:53:22 compute-1 systemd[216615]: Starting D-Bus User Message Bus Socket...
Feb 16 13:53:22 compute-1 systemd[216615]: Starting Create User's Volatile Files and Directories...
Feb 16 13:53:22 compute-1 systemd[216615]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:53:22 compute-1 systemd[216615]: Reached target Sockets.
Feb 16 13:53:22 compute-1 systemd[216615]: Finished Create User's Volatile Files and Directories.
Feb 16 13:53:22 compute-1 systemd[216615]: Reached target Basic System.
Feb 16 13:53:22 compute-1 systemd[216615]: Reached target Main User Target.
Feb 16 13:53:22 compute-1 systemd[216615]: Startup finished in 126ms.
Feb 16 13:53:22 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:53:22 compute-1 systemd[1]: Started Session 48 of User nova.
Feb 16 13:53:22 compute-1 sshd-session[216609]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:53:22 compute-1 sshd-session[216632]: Received disconnect from 192.168.122.100 port 48642:11: disconnected by user
Feb 16 13:53:22 compute-1 sshd-session[216632]: Disconnected from user nova 192.168.122.100 port 48642
Feb 16 13:53:22 compute-1 sshd-session[216609]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:53:22 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Feb 16 13:53:22 compute-1 systemd-logind[821]: Session 48 logged out. Waiting for processes to exit.
Feb 16 13:53:22 compute-1 systemd-logind[821]: Removed session 48.
Feb 16 13:53:24 compute-1 ovn_controller[96285]: 2026-02-16T13:53:24Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:49:ad 10.100.0.5
Feb 16 13:53:24 compute-1 ovn_controller[96285]: 2026-02-16T13:53:24Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:49:ad 10.100.0.5
Feb 16 13:53:24 compute-1 nova_compute[185910]: 2026-02-16 13:53:24.992 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:25 compute-1 podman[216652]: 2026-02-16 13:53:25.014060441 +0000 UTC m=+0.146301924 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 13:53:25 compute-1 nova_compute[185910]: 2026-02-16 13:53:25.273 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.322 185914 DEBUG nova.compute.manager [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.323 185914 DEBUG oslo_concurrency.lockutils [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.323 185914 DEBUG oslo_concurrency.lockutils [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.323 185914 DEBUG oslo_concurrency.lockutils [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.323 185914 DEBUG nova.compute.manager [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:26 compute-1 nova_compute[185910]: 2026-02-16 13:53:26.324 185914 DEBUG nova.compute.manager [req-4ba4c61c-e842-4227-b95a-b8b32769f648 req-3f663c9a-3a77-4da4-9377-95ffb78cfb56 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.668 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.683 185914 INFO nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Took 6.94 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.684 185914 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.704 185914 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpq8uj_sv2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b81c5faa-2832-4df4-8db7-1ffb8d8209ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(23baa630-8720-4df0-ae64-9dd4c6785d6f),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.727 185914 DEBUG nova.objects.instance [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lazy-loading 'migration_context' on Instance uuid b81c5faa-2832-4df4-8db7-1ffb8d8209ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.728 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.729 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.729 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.745 185914 DEBUG nova.virt.libvirt.vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:53:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:53:11Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.746 185914 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.746 185914 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.747 185914 DEBUG nova.virt.libvirt.migration [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:53:27 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:04:49:ad"/>
Feb 16 13:53:27 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:53:27 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:53:27 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:53:27 compute-1 nova_compute[185910]:   <target dev="tapda0306b3-85"/>
Feb 16 13:53:27 compute-1 nova_compute[185910]: </interface>
Feb 16 13:53:27 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:53:27 compute-1 nova_compute[185910]: 2026-02-16 13:53:27.747 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.233 185914 DEBUG nova.virt.libvirt.migration [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.234 185914 INFO nova.virt.libvirt.migration [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.317 185914 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.429 185914 DEBUG nova.compute.manager [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.429 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.430 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.430 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.430 185914 DEBUG nova.compute.manager [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.430 185914 WARNING nova.compute.manager [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state migrating.
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.431 185914 DEBUG nova.compute.manager [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-changed-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.431 185914 DEBUG nova.compute.manager [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Refreshing instance network info cache due to event network-changed-da0306b3-8514-4ef0-984c-14d90dedd285. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.431 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.431 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.431 185914 DEBUG nova.network.neutron [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Refreshing network info cache for port da0306b3-8514-4ef0-984c-14d90dedd285 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.820 185914 DEBUG nova.virt.libvirt.migration [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:53:28 compute-1 nova_compute[185910]: 2026-02-16 13:53:28.821 185914 DEBUG nova.virt.libvirt.migration [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.194 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771250009.1937175, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.194 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Paused (Lifecycle Event)
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.218 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.223 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.246 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:53:29 compute-1 kernel: tapda0306b3-85 (unregistering): left promiscuous mode
Feb 16 13:53:29 compute-1 NetworkManager[56388]: <info>  [1771250009.3160] device (tapda0306b3-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:53:29 compute-1 ovn_controller[96285]: 2026-02-16T13:53:29Z|00216|binding|INFO|Releasing lport da0306b3-8514-4ef0-984c-14d90dedd285 from this chassis (sb_readonly=0)
Feb 16 13:53:29 compute-1 ovn_controller[96285]: 2026-02-16T13:53:29Z|00217|binding|INFO|Setting lport da0306b3-8514-4ef0-984c-14d90dedd285 down in Southbound
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.323 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 ovn_controller[96285]: 2026-02-16T13:53:29Z|00218|binding|INFO|Removing iface tapda0306b3-85 ovn-installed in OVS
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.326 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.333 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:49:ad 10.100.0.5'], port_security=['fa:16:3e:04:49:ad 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b81c5faa-2832-4df4-8db7-1ffb8d8209ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '88d7e9d22dc247d4b0e2e95ecc7e73ad', 'neutron:revision_number': '8', 'neutron:security_group_ids': '412416b0-33c5-4b94-970c-86cdbe589da9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07efc4e8-a338-40bc-a1a5-892571713a01, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=da0306b3-8514-4ef0-984c-14d90dedd285) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.335 105573 INFO neutron.agent.ovn.metadata.agent [-] Port da0306b3-8514-4ef0-984c-14d90dedd285 in datapath 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c unbound from our chassis
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.336 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.338 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.338 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[70258848-e756-4e8d-83de-70e3258d86b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.338 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c namespace which is not needed anymore
Feb 16 13:53:29 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Feb 16 13:53:29 compute-1 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001c.scope: Consumed 13.296s CPU time.
Feb 16 13:53:29 compute-1 systemd-machined[155419]: Machine qemu-18-instance-0000001c terminated.
Feb 16 13:53:29 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [NOTICE]   (216545) : haproxy version is 2.8.14-c23fe91
Feb 16 13:53:29 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [NOTICE]   (216545) : path to executable is /usr/sbin/haproxy
Feb 16 13:53:29 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [WARNING]  (216545) : Exiting Master process...
Feb 16 13:53:29 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [ALERT]    (216545) : Current worker (216547) exited with code 143 (Terminated)
Feb 16 13:53:29 compute-1 neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c[216541]: [WARNING]  (216545) : All workers exited. Exiting... (0)
Feb 16 13:53:29 compute-1 systemd[1]: libpod-0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4.scope: Deactivated successfully.
Feb 16 13:53:29 compute-1 podman[216708]: 2026-02-16 13:53:29.497539452 +0000 UTC m=+0.053538534 container died 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.516 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.522 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4-userdata-shm.mount: Deactivated successfully.
Feb 16 13:53:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-60d8caf46271ba596eaf96c4e3db19c82bf62f81bb280bb84e83780a44d83734-merged.mount: Deactivated successfully.
Feb 16 13:53:29 compute-1 podman[216708]: 2026-02-16 13:53:29.539172254 +0000 UTC m=+0.095171336 container cleanup 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:53:29 compute-1 systemd[1]: libpod-conmon-0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4.scope: Deactivated successfully.
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.547 185914 DEBUG nova.compute.manager [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.548 185914 DEBUG oslo_concurrency.lockutils [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.548 185914 DEBUG oslo_concurrency.lockutils [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.549 185914 DEBUG oslo_concurrency.lockutils [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.549 185914 DEBUG nova.compute.manager [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.549 185914 DEBUG nova.compute.manager [req-21c2ae08-0f77-45de-8b04-73264edf71e9 req-80c4a26f-5532-4af2-a063-944ba81d9dcb faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.558 185914 DEBUG nova.virt.libvirt.guest [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.558 185914 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migration operation has completed
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.558 185914 INFO nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] _post_live_migration() is started..
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.565 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.565 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.566 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:53:29 compute-1 podman[216748]: 2026-02-16 13:53:29.594868035 +0000 UTC m=+0.037705847 container remove 0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.599 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a895cdc3-dd36-41e6-9b3b-ab5eb033bb42]: (4, ('Mon Feb 16 01:53:29 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c (0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4)\n0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4\nMon Feb 16 01:53:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c (0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4)\n0ee17207c557f2538c424b162b3e48b099d4bd1b2457901c0f6af0666e7ae5a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.602 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8b6c8f-3225-42a9-9514-9918c4a7d76a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.603 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f3f30c5-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:29 compute-1 kernel: tap9f3f30c5-b0: left promiscuous mode
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.606 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.612 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.615 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[953263cc-104d-41db-a87f-301e91c0675d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.640 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[661c0ba2-ebcd-4f1a-a3d8-ae760b152672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.642 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5b3c8c-b509-464a-a944-bc853be320f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.654 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[ea42f78f-5871-4f3b-9bf8-4795784eb035]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607532, 'reachable_time': 42945, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216768, 'error': None, 'target': 'ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 systemd[1]: run-netns-ovnmeta\x2d9f3f30c5\x2db9b2\x2d44c9\x2dbea9\x2d678f6d4e1e0c.mount: Deactivated successfully.
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.658 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:53:29 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:29.658 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[78146286-16f6-4192-b2a6-76edc78fb11d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:53:29 compute-1 nova_compute[185910]: 2026-02-16 13:53:29.995 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:30 compute-1 nova_compute[185910]: 2026-02-16 13:53:30.276 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:30 compute-1 nova_compute[185910]: 2026-02-16 13:53:30.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:30 compute-1 nova_compute[185910]: 2026-02-16 13:53:30.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.244 185914 DEBUG nova.network.neutron [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updated VIF entry in instance network info cache for port da0306b3-8514-4ef0-984c-14d90dedd285. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.245 185914 DEBUG nova.network.neutron [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating instance_info_cache with network_info: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.284 185914 DEBUG oslo_concurrency.lockutils [req-1e4aff13-89e3-4341-bbed-0c39142746c4 req-6067d7b3-5444-4c7e-8ec7-6fa536468cfc faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-b81c5faa-2832-4df4-8db7-1ffb8d8209ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:53:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:31.293 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:53:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:31.294 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:53:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:53:31.296 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.334 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.394 185914 DEBUG nova.network.neutron [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Activated binding for port da0306b3-8514-4ef0-984c-14d90dedd285 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.395 185914 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.396 185914 DEBUG nova.virt.libvirt.vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:53:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-1419341349',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-1419341349',id=28,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:53:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='88d7e9d22dc247d4b0e2e95ecc7e73ad',ramdisk_id='',reservation_id='r-9nhfpgiy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-492275053-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:53:18Z,user_data=None,user_id='178de9ab917a4ba5a84dc9f520a0847f',uuid=b81c5faa-2832-4df4-8db7-1ffb8d8209ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.397 185914 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converting VIF {"id": "da0306b3-8514-4ef0-984c-14d90dedd285", "address": "fa:16:3e:04:49:ad", "network": {"id": "9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-2080676524-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "88d7e9d22dc247d4b0e2e95ecc7e73ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda0306b3-85", "ovs_interfaceid": "da0306b3-8514-4ef0-984c-14d90dedd285", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.398 185914 DEBUG nova.network.os_vif_util [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.398 185914 DEBUG os_vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.402 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.403 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda0306b3-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.405 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.408 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.412 185914 INFO os_vif [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:49:ad,bridge_name='br-int',has_traffic_filtering=True,id=da0306b3-8514-4ef0-984c-14d90dedd285,network=Network(9f3f30c5-b9b2-44c9-bea9-678f6d4e1e0c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda0306b3-85')
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.412 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.412 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.413 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.413 185914 DEBUG nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.413 185914 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Deleting instance files /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab_del
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.414 185914 INFO nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Deletion of /var/lib/nova/instances/b81c5faa-2832-4df4-8db7-1ffb8d8209ab_del complete
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.786 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.787 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.787 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.787 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.787 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.788 185914 WARNING nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state migrating.
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.788 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.788 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.788 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.789 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.789 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.789 185914 WARNING nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state migrating.
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.789 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.790 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.790 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.790 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.790 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.791 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-unplugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.791 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.791 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.791 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.791 185914 DEBUG oslo_concurrency.lockutils [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.792 185914 DEBUG nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:31 compute-1 nova_compute[185910]: 2026-02-16 13:53:31.792 185914 WARNING nova.compute.manager [req-ca002dbf-2626-4dde-b842-28bea86e5f8b req-4b0bda1a-ae25-45a0-99b1-a47364054447 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state migrating.
Feb 16 13:53:31 compute-1 podman[216769]: 2026-02-16 13:53:31.915301763 +0000 UTC m=+0.054739076 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:53:33 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:53:33 compute-1 systemd[216615]: Activating special unit Exit the Session...
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped target Main User Target.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped target Basic System.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped target Paths.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped target Sockets.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped target Timers.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:53:33 compute-1 systemd[216615]: Closed D-Bus User Message Bus Socket.
Feb 16 13:53:33 compute-1 systemd[216615]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:53:33 compute-1 systemd[216615]: Removed slice User Application Slice.
Feb 16 13:53:33 compute-1 systemd[216615]: Reached target Shutdown.
Feb 16 13:53:33 compute-1 systemd[216615]: Finished Exit the Session.
Feb 16 13:53:33 compute-1 systemd[216615]: Reached target Exit the Session.
Feb 16 13:53:33 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:53:33 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:53:33 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:53:33 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:53:33 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:53:33 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:53:33 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.878 185914 DEBUG nova.compute.manager [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.878 185914 DEBUG oslo_concurrency.lockutils [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.878 185914 DEBUG oslo_concurrency.lockutils [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.879 185914 DEBUG oslo_concurrency.lockutils [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.879 185914 DEBUG nova.compute.manager [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] No waiting events found dispatching network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:53:33 compute-1 nova_compute[185910]: 2026-02-16 13:53:33.879 185914 WARNING nova.compute.manager [req-6688fd1e-1ef6-473b-805a-a7fb08dc95db req-4b50f1e6-3a06-4c24-b61b-504b97e3f2ea faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Received unexpected event network-vif-plugged-da0306b3-8514-4ef0-984c-14d90dedd285 for instance with vm_state active and task_state migrating.
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.657 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.658 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.658 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.658 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.780 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.781 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5781MB free_disk=73.22306823730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.781 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.782 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.823 185914 INFO nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Updating resource usage from migration 23baa630-8720-4df0-ae64-9dd4c6785d6f
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.858 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Migration 23baa630-8720-4df0-ae64-9dd4c6785d6f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.858 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.858 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.916 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.931 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.958 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.959 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:34 compute-1 nova_compute[185910]: 2026-02-16 13:53:34.998 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:35 compute-1 podman[195236]: time="2026-02-16T13:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:53:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:53:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:53:35 compute-1 nova_compute[185910]: 2026-02-16 13:53:35.953 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:36 compute-1 nova_compute[185910]: 2026-02-16 13:53:36.406 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:37 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 13:53:37 compute-1 nova_compute[185910]: 2026-02-16 13:53:37.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:37 compute-1 nova_compute[185910]: 2026-02-16 13:53:37.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:53:37 compute-1 nova_compute[185910]: 2026-02-16 13:53:37.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:53:37 compute-1 nova_compute[185910]: 2026-02-16 13:53:37.652 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:53:37 compute-1 nova_compute[185910]: 2026-02-16 13:53:37.653 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:37 compute-1 sshd-session[216795]: Invalid user postgres from 188.166.42.159 port 54758
Feb 16 13:53:38 compute-1 sshd-session[216795]: Connection closed by invalid user postgres 188.166.42.159 port 54758 [preauth]
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.200 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.201 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.201 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "b81c5faa-2832-4df4-8db7-1ffb8d8209ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.234 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.235 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.235 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.235 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.401 185914 WARNING nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.403 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5785MB free_disk=73.22306823730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.403 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.403 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.446 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Migration for instance b81c5faa-2832-4df4-8db7-1ffb8d8209ab refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.467 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.503 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Migration 23baa630-8720-4df0-ae64-9dd4c6785d6f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.504 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.504 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.552 185914 DEBUG nova.compute.provider_tree [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.576 185914 DEBUG nova.scheduler.client.report [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.602 185914 DEBUG nova.compute.resource_tracker [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.602 185914 DEBUG oslo_concurrency.lockutils [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.606 185914 INFO nova.compute.manager [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.689 185914 INFO nova.scheduler.client.report [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] Deleted allocation for migration 23baa630-8720-4df0-ae64-9dd4c6785d6f
Feb 16 13:53:38 compute-1 nova_compute[185910]: 2026-02-16 13:53:38.690 185914 DEBUG nova.virt.libvirt.driver [None req-719ce552-7116-4f20-91ab-02a18f61571d d91e60f0222748ae9c16414a64f4e863 fa8ec31824694513a42cc22c81880a5c - - default default] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:53:40 compute-1 nova_compute[185910]: 2026-02-16 13:53:39.999 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:41 compute-1 nova_compute[185910]: 2026-02-16 13:53:41.409 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:41 compute-1 nova_compute[185910]: 2026-02-16 13:53:41.672 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:43 compute-1 nova_compute[185910]: 2026-02-16 13:53:43.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:43 compute-1 nova_compute[185910]: 2026-02-16 13:53:43.633 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:53:44 compute-1 nova_compute[185910]: 2026-02-16 13:53:44.560 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771250009.557857, b81c5faa-2832-4df4-8db7-1ffb8d8209ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:53:44 compute-1 nova_compute[185910]: 2026-02-16 13:53:44.560 185914 INFO nova.compute.manager [-] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] VM Stopped (Lifecycle Event)
Feb 16 13:53:44 compute-1 nova_compute[185910]: 2026-02-16 13:53:44.583 185914 DEBUG nova.compute.manager [None req-ad4e1220-a7df-4201-b580-258baa24d574 - - - - - -] [instance: b81c5faa-2832-4df4-8db7-1ffb8d8209ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:53:45 compute-1 nova_compute[185910]: 2026-02-16 13:53:45.038 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:45 compute-1 podman[216799]: 2026-02-16 13:53:45.913159312 +0000 UTC m=+0.052451135 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:53:45 compute-1 podman[216798]: 2026-02-16 13:53:45.925419603 +0000 UTC m=+0.068922899 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1770267347, distribution-scope=public, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter)
Feb 16 13:53:46 compute-1 nova_compute[185910]: 2026-02-16 13:53:46.413 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:47 compute-1 nova_compute[185910]: 2026-02-16 13:53:47.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:47 compute-1 nova_compute[185910]: 2026-02-16 13:53:47.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 13:53:48 compute-1 nova_compute[185910]: 2026-02-16 13:53:48.645 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:53:48 compute-1 nova_compute[185910]: 2026-02-16 13:53:48.645 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 13:53:48 compute-1 nova_compute[185910]: 2026-02-16 13:53:48.658 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 13:53:49 compute-1 openstack_network_exporter[198096]: ERROR   13:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:53:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:53:49 compute-1 openstack_network_exporter[198096]: ERROR   13:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:53:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:53:50 compute-1 nova_compute[185910]: 2026-02-16 13:53:50.044 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:51 compute-1 nova_compute[185910]: 2026-02-16 13:53:51.416 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:55 compute-1 nova_compute[185910]: 2026-02-16 13:53:55.047 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:53:55 compute-1 podman[216838]: 2026-02-16 13:53:55.963116496 +0000 UTC m=+0.101486856 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:53:56 compute-1 nova_compute[185910]: 2026-02-16 13:53:56.475 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:00 compute-1 nova_compute[185910]: 2026-02-16 13:54:00.049 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:01 compute-1 nova_compute[185910]: 2026-02-16 13:54:01.479 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:02 compute-1 podman[216864]: 2026-02-16 13:54:02.915845046 +0000 UTC m=+0.053853753 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 13:54:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:03.367 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:03.368 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:03.368 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:04 compute-1 sshd-session[216889]: Invalid user test from 146.190.226.24 port 37954
Feb 16 13:54:04 compute-1 sshd-session[216889]: Connection closed by invalid user test 146.190.226.24 port 37954 [preauth]
Feb 16 13:54:05 compute-1 nova_compute[185910]: 2026-02-16 13:54:05.052 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:05 compute-1 podman[195236]: time="2026-02-16T13:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:54:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:54:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:54:06 compute-1 nova_compute[185910]: 2026-02-16 13:54:06.483 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:10 compute-1 nova_compute[185910]: 2026-02-16 13:54:10.055 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:11 compute-1 nova_compute[185910]: 2026-02-16 13:54:11.502 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:15 compute-1 nova_compute[185910]: 2026-02-16 13:54:15.109 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:16 compute-1 nova_compute[185910]: 2026-02-16 13:54:16.504 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:16 compute-1 podman[216892]: 2026-02-16 13:54:16.928915266 +0000 UTC m=+0.066334998 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64)
Feb 16 13:54:16 compute-1 podman[216893]: 2026-02-16 13:54:16.92906367 +0000 UTC m=+0.058832666 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 13:54:18 compute-1 ovn_controller[96285]: 2026-02-16T13:54:18Z|00219|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 ovsdb.go:87: Transact: context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 collector.go:244: OvsdbGet(vswitch): context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 ovsdb.go:123: Transact: context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 collector.go:47: db.List(Bridge): context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 ovsdb.go:87: Transact: context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 collector.go:45: OvsdbGet(vswitch): context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 ovsdb.go:123: Transact: context deadline exceeded
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 collector.go:43: db.List(Bridge): context deadline exceeded
Feb 16 13:54:20 compute-1 nova_compute[185910]: 2026-02-16 13:54:20.918 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: ERROR   13:54:20 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:54:20 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:54:21 compute-1 nova_compute[185910]: 2026-02-16 13:54:21.506 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:25 compute-1 nova_compute[185910]: 2026-02-16 13:54:25.920 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:26 compute-1 nova_compute[185910]: 2026-02-16 13:54:26.508 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:26 compute-1 nova_compute[185910]: 2026-02-16 13:54:26.910 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:26 compute-1 podman[216929]: 2026-02-16 13:54:26.930781383 +0000 UTC m=+0.074122589 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 13:54:27 compute-1 nova_compute[185910]: 2026-02-16 13:54:27.655 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:29 compute-1 nova_compute[185910]: 2026-02-16 13:54:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:29 compute-1 nova_compute[185910]: 2026-02-16 13:54:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:30 compute-1 sshd-session[216955]: Invalid user postgres from 188.166.42.159 port 46274
Feb 16 13:54:30 compute-1 nova_compute[185910]: 2026-02-16 13:54:30.923 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:31 compute-1 sshd-session[216955]: Connection closed by invalid user postgres 188.166.42.159 port 46274 [preauth]
Feb 16 13:54:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:31.442 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:54:31 compute-1 nova_compute[185910]: 2026-02-16 13:54:31.442 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:31 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:31.443 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:54:31 compute-1 nova_compute[185910]: 2026-02-16 13:54:31.510 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:31 compute-1 nova_compute[185910]: 2026-02-16 13:54:31.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:31 compute-1 nova_compute[185910]: 2026-02-16 13:54:31.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:32 compute-1 nova_compute[185910]: 2026-02-16 13:54:32.586 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:33 compute-1 podman[216957]: 2026-02-16 13:54:33.918733761 +0000 UTC m=+0.058339583 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:54:34 compute-1 nova_compute[185910]: 2026-02-16 13:54:34.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:35 compute-1 podman[195236]: time="2026-02-16T13:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:54:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:54:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Feb 16 13:54:35 compute-1 nova_compute[185910]: 2026-02-16 13:54:35.924 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.015 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.016 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.016 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.016 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.140 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.141 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5811MB free_disk=73.22306823730469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.141 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.142 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.512 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.681 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.682 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.730 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.814 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.816 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:54:36 compute-1 nova_compute[185910]: 2026-02-16 13:54:36.816 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.675s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:54:37 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:54:37.445 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:54:39 compute-1 nova_compute[185910]: 2026-02-16 13:54:39.810 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:39 compute-1 nova_compute[185910]: 2026-02-16 13:54:39.811 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:39 compute-1 nova_compute[185910]: 2026-02-16 13:54:39.811 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:54:39 compute-1 nova_compute[185910]: 2026-02-16 13:54:39.811 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:54:40 compute-1 nova_compute[185910]: 2026-02-16 13:54:40.926 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:41 compute-1 nova_compute[185910]: 2026-02-16 13:54:41.515 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:41 compute-1 nova_compute[185910]: 2026-02-16 13:54:41.707 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:54:45 compute-1 nova_compute[185910]: 2026-02-16 13:54:45.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:54:45 compute-1 nova_compute[185910]: 2026-02-16 13:54:45.635 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:54:45 compute-1 nova_compute[185910]: 2026-02-16 13:54:45.928 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:46 compute-1 nova_compute[185910]: 2026-02-16 13:54:46.558 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:47 compute-1 podman[216984]: 2026-02-16 13:54:47.915745918 +0000 UTC m=+0.051397017 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 16 13:54:47 compute-1 podman[216983]: 2026-02-16 13:54:47.916604761 +0000 UTC m=+0.055753574 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.)
Feb 16 13:54:49 compute-1 openstack_network_exporter[198096]: ERROR   13:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:54:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:54:49 compute-1 openstack_network_exporter[198096]: ERROR   13:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:54:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:54:50 compute-1 nova_compute[185910]: 2026-02-16 13:54:50.929 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:51 compute-1 nova_compute[185910]: 2026-02-16 13:54:51.561 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:55 compute-1 nova_compute[185910]: 2026-02-16 13:54:55.931 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:56 compute-1 nova_compute[185910]: 2026-02-16 13:54:56.593 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:54:57 compute-1 podman[217024]: 2026-02-16 13:54:57.956156747 +0000 UTC m=+0.100547082 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 16 13:55:00 compute-1 nova_compute[185910]: 2026-02-16 13:55:00.933 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:01 compute-1 nova_compute[185910]: 2026-02-16 13:55:01.633 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:03.368 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:03.369 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:03.369 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:04 compute-1 podman[217050]: 2026-02-16 13:55:04.913455709 +0000 UTC m=+0.049426782 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:55:05 compute-1 podman[195236]: time="2026-02-16T13:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:55:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:55:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:55:05 compute-1 nova_compute[185910]: 2026-02-16 13:55:05.935 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:06 compute-1 nova_compute[185910]: 2026-02-16 13:55:06.677 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:10 compute-1 sshd-session[217076]: Invalid user test from 146.190.226.24 port 33390
Feb 16 13:55:10 compute-1 sshd-session[217076]: Connection closed by invalid user test 146.190.226.24 port 33390 [preauth]
Feb 16 13:55:10 compute-1 nova_compute[185910]: 2026-02-16 13:55:10.937 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:11 compute-1 nova_compute[185910]: 2026-02-16 13:55:11.679 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:15 compute-1 nova_compute[185910]: 2026-02-16 13:55:15.941 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:16 compute-1 nova_compute[185910]: 2026-02-16 13:55:16.682 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:19 compute-1 openstack_network_exporter[198096]: ERROR   13:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:55:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:55:19 compute-1 openstack_network_exporter[198096]: ERROR   13:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:55:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:55:19 compute-1 ovn_controller[96285]: 2026-02-16T13:55:19Z|00220|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 13:55:19 compute-1 podman[217079]: 2026-02-16 13:55:19.694463006 +0000 UTC m=+0.054228583 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:55:19 compute-1 podman[217078]: 2026-02-16 13:55:19.720977481 +0000 UTC m=+0.087078208 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, architecture=x86_64, version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347)
Feb 16 13:55:20 compute-1 nova_compute[185910]: 2026-02-16 13:55:20.942 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:21 compute-1 nova_compute[185910]: 2026-02-16 13:55:21.684 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:23 compute-1 sshd-session[217119]: Invalid user postgres from 188.166.42.159 port 39832
Feb 16 13:55:24 compute-1 sshd-session[217119]: Connection closed by invalid user postgres 188.166.42.159 port 39832 [preauth]
Feb 16 13:55:24 compute-1 sshd-session[217121]: Invalid user ubuntu from 2.57.122.210 port 51524
Feb 16 13:55:24 compute-1 sshd-session[217121]: Connection closed by invalid user ubuntu 2.57.122.210 port 51524 [preauth]
Feb 16 13:55:25 compute-1 nova_compute[185910]: 2026-02-16 13:55:25.944 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:26 compute-1 nova_compute[185910]: 2026-02-16 13:55:26.733 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:28 compute-1 podman[217123]: 2026-02-16 13:55:28.954028445 +0000 UTC m=+0.099823892 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 13:55:29 compute-1 nova_compute[185910]: 2026-02-16 13:55:29.633 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:29 compute-1 nova_compute[185910]: 2026-02-16 13:55:29.634 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:29 compute-1 nova_compute[185910]: 2026-02-16 13:55:29.634 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:30 compute-1 nova_compute[185910]: 2026-02-16 13:55:30.946 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:31 compute-1 nova_compute[185910]: 2026-02-16 13:55:31.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:31 compute-1 nova_compute[185910]: 2026-02-16 13:55:31.763 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:33 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:33.309 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:55:33 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:33.310 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:55:33 compute-1 nova_compute[185910]: 2026-02-16 13:55:33.310 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:33 compute-1 nova_compute[185910]: 2026-02-16 13:55:33.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:35 compute-1 podman[195236]: time="2026-02-16T13:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:55:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:55:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 13:55:35 compute-1 podman[217149]: 2026-02-16 13:55:35.911307777 +0000 UTC m=+0.050110142 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:55:35 compute-1 nova_compute[185910]: 2026-02-16 13:55:35.948 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.676 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.677 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.677 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.766 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.821 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.822 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5831MB free_disk=73.22298812866211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:36 compute-1 nova_compute[185910]: 2026-02-16 13:55:36.822 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.199 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.199 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.228 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.251 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.253 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:55:37 compute-1 nova_compute[185910]: 2026-02-16 13:55:37.253 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:39 compute-1 nova_compute[185910]: 2026-02-16 13:55:39.249 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:40 compute-1 nova_compute[185910]: 2026-02-16 13:55:40.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:40 compute-1 nova_compute[185910]: 2026-02-16 13:55:40.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:55:40 compute-1 nova_compute[185910]: 2026-02-16 13:55:40.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:55:40 compute-1 nova_compute[185910]: 2026-02-16 13:55:40.666 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:55:40 compute-1 nova_compute[185910]: 2026-02-16 13:55:40.950 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:41 compute-1 nova_compute[185910]: 2026-02-16 13:55:41.768 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:42 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:42.314 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:42 compute-1 nova_compute[185910]: 2026-02-16 13:55:42.661 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.578 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.579 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.615 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.718 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.719 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.725 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.725 185914 INFO nova.compute.claims [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Claim successful on node compute-1.ctlplane.example.com
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.883 185914 DEBUG nova.compute.provider_tree [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.920 185914 DEBUG nova.scheduler.client.report [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.950 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:43 compute-1 nova_compute[185910]: 2026-02-16 13:55:43.951 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.028 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.028 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 13:55:44 compute-1 rsyslogd[1016]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.057 185914 INFO nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.079 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.188 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.190 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.191 185914 INFO nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating image(s)
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.192 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.192 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.194 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.221 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.271 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.272 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.272 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.284 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.330 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.331 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.363 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7,backing_fmt=raw /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.364 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "755ae6cb63977e865a71a8c38a1a67ce95c9d0b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.365 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.397 185914 DEBUG nova.policy [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a2a37907788d4195986dc759905dcc95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77712f67f33f426cb3d6d9b7a640f32a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.418 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/755ae6cb63977e865a71a8c38a1a67ce95c9d0b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.419 185914 DEBUG nova.virt.disk.api [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Checking if we can resize image /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.420 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.476 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.477 185914 DEBUG nova.virt.disk.api [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Cannot resize image /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.478 185914 DEBUG nova.objects.instance [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lazy-loading 'migration_context' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.538 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.539 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Ensure instance console log exists: /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.539 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.539 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:44 compute-1 nova_compute[185910]: 2026-02-16 13:55:44.540 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:45 compute-1 nova_compute[185910]: 2026-02-16 13:55:45.177 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Successfully created port: 8d907fd7-6b02-461e-8612-e5f777af8eea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 13:55:45 compute-1 nova_compute[185910]: 2026-02-16 13:55:45.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:55:45 compute-1 nova_compute[185910]: 2026-02-16 13:55:45.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:55:45 compute-1 nova_compute[185910]: 2026-02-16 13:55:45.951 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:46 compute-1 nova_compute[185910]: 2026-02-16 13:55:46.771 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.365 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Successfully updated port: 8d907fd7-6b02-461e-8612-e5f777af8eea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.386 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.387 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.387 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.592 185914 DEBUG nova.compute.manager [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-changed-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.593 185914 DEBUG nova.compute.manager [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Refreshing instance network info cache due to event network-changed-8d907fd7-6b02-461e-8612-e5f777af8eea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.593 185914 DEBUG oslo_concurrency.lockutils [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:55:47 compute-1 nova_compute[185910]: 2026-02-16 13:55:47.648 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 13:55:49 compute-1 openstack_network_exporter[198096]: ERROR   13:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:55:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:55:49 compute-1 openstack_network_exporter[198096]: ERROR   13:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:55:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:55:49 compute-1 podman[217190]: 2026-02-16 13:55:49.910893733 +0000 UTC m=+0.050701288 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 13:55:49 compute-1 podman[217189]: 2026-02-16 13:55:49.941056676 +0000 UTC m=+0.084514329 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.372 185914 DEBUG nova.network.neutron [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.402 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.402 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance network_info: |[{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.403 185914 DEBUG oslo_concurrency.lockutils [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.404 185914 DEBUG nova.network.neutron [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Refreshing network info cache for port 8d907fd7-6b02-461e-8612-e5f777af8eea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.409 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Start _get_guest_xml network_info=[{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encrypted': False, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_options': None, 'boot_index': 0, 'device_name': '/dev/vda', 'guest_format': None, 'encryption_secret_uuid': None, 'size': 0, 'image_id': '6fb9af7f-2971-4890-a777-6e99e888717f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.415 185914 WARNING nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.431 185914 DEBUG nova.virt.libvirt.host [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.432 185914 DEBUG nova.virt.libvirt.host [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.435 185914 DEBUG nova.virt.libvirt.host [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.436 185914 DEBUG nova.virt.libvirt.host [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.438 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.438 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T13:16:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d89f72c-1760-421e-a5f2-83dfc3723b84',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T13:17:01Z,direct_url=<?>,disk_format='qcow2',id=6fb9af7f-2971-4890-a777-6e99e888717f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fa8ec31824694513a42cc22c81880a5c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T13:17:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.439 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.440 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.440 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.441 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.441 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.442 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.443 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.443 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.444 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.444 185914 DEBUG nova.virt.hardware [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.451 185914 DEBUG nova.virt.libvirt.vif [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2
141702843-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:55:44Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.452 185914 DEBUG nova.network.os_vif_util [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.453 185914 DEBUG nova.network.os_vif_util [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.455 185914 DEBUG nova.objects.instance [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lazy-loading 'pci_devices' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.502 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] End _get_guest_xml xml=<domain type="kvm">
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <uuid>3266d7e2-8d63-44ff-970a-45b95f88dc2f</uuid>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <name>instance-0000001d</name>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <memory>131072</memory>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <vcpu>1</vcpu>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <metadata>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1379037604</nova:name>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:creationTime>2026-02-16 13:55:50</nova:creationTime>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:flavor name="m1.nano">
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:memory>128</nova:memory>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:disk>1</nova:disk>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:swap>0</nova:swap>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:vcpus>1</nova:vcpus>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       </nova:flavor>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:owner>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:user uuid="a2a37907788d4195986dc759905dcc95">tempest-TestExecuteZoneMigrationStrategy-2141702843-project-member</nova:user>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:project uuid="77712f67f33f426cb3d6d9b7a640f32a">tempest-TestExecuteZoneMigrationStrategy-2141702843</nova:project>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       </nova:owner>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:root type="image" uuid="6fb9af7f-2971-4890-a777-6e99e888717f"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <nova:ports>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         <nova:port uuid="8d907fd7-6b02-461e-8612-e5f777af8eea">
Feb 16 13:55:50 compute-1 nova_compute[185910]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:         </nova:port>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       </nova:ports>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </nova:instance>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </metadata>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <sysinfo type="smbios">
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <system>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="manufacturer">RDO</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="product">OpenStack Compute</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="serial">3266d7e2-8d63-44ff-970a-45b95f88dc2f</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="uuid">3266d7e2-8d63-44ff-970a-45b95f88dc2f</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <entry name="family">Virtual Machine</entry>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </system>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </sysinfo>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <os>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <boot dev="hd"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <smbios mode="sysinfo"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </os>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <features>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <acpi/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <apic/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <vmcoreinfo/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </features>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <clock offset="utc">
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <timer name="hpet" present="no"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </clock>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <cpu mode="custom" match="exact">
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <model>Nehalem</model>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </cpu>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   <devices>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <disk type="file" device="disk">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <target dev="vda" bus="virtio"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <disk type="file" device="cdrom">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <source file="/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <target dev="sda" bus="sata"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </disk>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <interface type="ethernet">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <mac address="fa:16:3e:d1:53:06"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <mtu size="1442"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <target dev="tap8d907fd7-6b"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </interface>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <serial type="pty">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <log file="/var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/console.log" append="off"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </serial>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <video>
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <model type="virtio"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </video>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <input type="tablet" bus="usb"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <rng model="virtio">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <backend model="random">/dev/urandom</backend>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </rng>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <controller type="usb" index="0"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     <memballoon model="virtio">
Feb 16 13:55:50 compute-1 nova_compute[185910]:       <stats period="10"/>
Feb 16 13:55:50 compute-1 nova_compute[185910]:     </memballoon>
Feb 16 13:55:50 compute-1 nova_compute[185910]:   </devices>
Feb 16 13:55:50 compute-1 nova_compute[185910]: </domain>
Feb 16 13:55:50 compute-1 nova_compute[185910]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.503 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Preparing to wait for external event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.504 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.505 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.505 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.506 185914 DEBUG nova.virt.libvirt.vif [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigration
Strategy-2141702843-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T13:55:44Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.506 185914 DEBUG nova.network.os_vif_util [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.508 185914 DEBUG nova.network.os_vif_util [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.508 185914 DEBUG os_vif [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.509 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.510 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.510 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.516 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.517 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d907fd7-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.517 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8d907fd7-6b, col_values=(('external_ids', {'iface-id': '8d907fd7-6b02-461e-8612-e5f777af8eea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:53:06', 'vm-uuid': '3266d7e2-8d63-44ff-970a-45b95f88dc2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.519 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:50 compute-1 NetworkManager[56388]: <info>  [1771250150.5204] manager: (tap8d907fd7-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.522 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.525 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.525 185914 INFO os_vif [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b')
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.606 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.606 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.607 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] No VIF found with MAC fa:16:3e:d1:53:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.608 185914 INFO nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Using config drive
Feb 16 13:55:50 compute-1 nova_compute[185910]: 2026-02-16 13:55:50.952 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.717 185914 INFO nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Creating config drive at /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.722 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgd2kesm5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.844 185914 DEBUG oslo_concurrency.processutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpgd2kesm5" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:55:51 compute-1 kernel: tap8d907fd7-6b: entered promiscuous mode
Feb 16 13:55:51 compute-1 NetworkManager[56388]: <info>  [1771250151.8980] manager: (tap8d907fd7-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.897 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-1 ovn_controller[96285]: 2026-02-16T13:55:51Z|00221|binding|INFO|Claiming lport 8d907fd7-6b02-461e-8612-e5f777af8eea for this chassis.
Feb 16 13:55:51 compute-1 ovn_controller[96285]: 2026-02-16T13:55:51Z|00222|binding|INFO|8d907fd7-6b02-461e-8612-e5f777af8eea: Claiming fa:16:3e:d1:53:06 10.100.0.8
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.901 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.904 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.918 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:53:06 10.100.0.8'], port_security=['fa:16:3e:d1:53:06 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3266d7e2-8d63-44ff-970a-45b95f88dc2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e98704-cf1f-47d1-8021-93211c7aa37e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77712f67f33f426cb3d6d9b7a640f32a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c1368845-2f7a-494d-9bee-474d9166c8a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4e8c351-7159-44d3-b122-efa9b0154fd9, chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=8d907fd7-6b02-461e-8612-e5f777af8eea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:55:51 compute-1 ovn_controller[96285]: 2026-02-16T13:55:51Z|00223|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea ovn-installed in OVS
Feb 16 13:55:51 compute-1 ovn_controller[96285]: 2026-02-16T13:55:51Z|00224|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea up in Southbound
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.919 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 8d907fd7-6b02-461e-8612-e5f777af8eea in datapath 09e98704-cf1f-47d1-8021-93211c7aa37e bound to our chassis
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.920 105573 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:55:51 compute-1 nova_compute[185910]: 2026-02-16 13:55:51.920 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:51 compute-1 systemd-udevd[217246]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:55:51 compute-1 systemd-machined[155419]: New machine qemu-19-instance-0000001d.
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.931 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[2c052b6d-70ef-457d-b3ea-4cba6ae0e9f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.932 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09e98704-c1 in ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.935 206668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09e98704-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.935 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[d941980f-7b5c-44e7-8d4b-f958920bcdd0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 NetworkManager[56388]: <info>  [1771250151.9360] device (tap8d907fd7-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 13:55:51 compute-1 NetworkManager[56388]: <info>  [1771250151.9365] device (tap8d907fd7-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.937 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[066796bb-f8af-4486-ae1f-f112f924b7d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 systemd[1]: Started Virtual Machine qemu-19-instance-0000001d.
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.945 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[b3830ff2-4ab4-4ba2-a552-b84bb87e0167]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.956 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4b015c1a-405c-4db7-b458-732de678311e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.979 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[e69ac0f5-2587-46a2-806a-c548afd0aa93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:51 compute-1 NetworkManager[56388]: <info>  [1771250151.9861] manager: (tap09e98704-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Feb 16 13:55:51 compute-1 systemd-udevd[217250]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 13:55:51 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:51.986 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3dce77-bc83-45ba-9d2c-11211fa82377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.011 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9a6739-187d-4517-bd42-2a583ec14032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.017 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[52405f0f-70f8-4dcf-9ef6-07b1884584a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 NetworkManager[56388]: <info>  [1771250152.0352] device (tap09e98704-c0): carrier: link connected
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.038 206713 DEBUG oslo.privsep.daemon [-] privsep: reply[5001cefa-936a-46a9-8fa8-5060f60b0b4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.051 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[c946e345-ef88-4d17-a944-484b2c57206e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e98704-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:74:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623645, 'reachable_time': 17771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217280, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.064 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[4126c14d-bf1b-4da2-882d-77a13001b408]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:7412'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623645, 'tstamp': 623645}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217281, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.076 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a51d3582-04ae-4fcb-90c7-73ea67254203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09e98704-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:74:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623645, 'reachable_time': 17771, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217282, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.096 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[952b87aa-4464-4668-90c0-bf9a89dd538b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.131 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[09cefb71-21fc-4a8a-87a0-4ccac45f83c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.132 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e98704-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.133 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.133 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09e98704-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.135 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:52 compute-1 NetworkManager[56388]: <info>  [1771250152.1362] manager: (tap09e98704-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Feb 16 13:55:52 compute-1 kernel: tap09e98704-c0: entered promiscuous mode
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.138 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09e98704-c0, col_values=(('external_ids', {'iface-id': 'eea5c447-c012-4beb-b864-a8e81dbeffa6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.139 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:52 compute-1 ovn_controller[96285]: 2026-02-16T13:55:52Z|00225|binding|INFO|Releasing lport eea5c447-c012-4beb-b864-a8e81dbeffa6 from this chassis (sb_readonly=0)
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.140 105573 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.143 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[da0bfb7c-722e-498e-8d8b-ac8d05202b1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.144 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.144 105573 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: global
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     log         /dev/log local0 debug
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     log-tag     haproxy-metadata-proxy-09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     user        root
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     group       root
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     maxconn     1024
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     pidfile     /var/lib/neutron/external/pids/09e98704-cf1f-47d1-8021-93211c7aa37e.pid.haproxy
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     daemon
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: defaults
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     log global
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     mode http
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     option httplog
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     option dontlognull
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     option http-server-close
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     option forwardfor
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     retries                 3
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     timeout http-request    30s
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     timeout connect         30s
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     timeout client          32s
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     timeout server          32s
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     timeout http-keep-alive 30s
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: listen listener
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     bind 169.254.169.254:80
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:     http-request add-header X-OVN-Network-ID 09e98704-cf1f-47d1-8021-93211c7aa37e
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 13:55:52 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:55:52.145 105573 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'env', 'PROCESS_TAG=haproxy-09e98704-cf1f-47d1-8021-93211c7aa37e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09e98704-cf1f-47d1-8021-93211c7aa37e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.348 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771250152.3470843, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.348 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Started (Lifecycle Event)
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.378 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.382 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771250152.3478942, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.382 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Paused (Lifecycle Event)
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.404 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.407 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.438 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:55:52 compute-1 podman[217321]: 2026-02-16 13:55:52.462548392 +0000 UTC m=+0.040798451 container create 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 13:55:52 compute-1 systemd[1]: Started libpod-conmon-6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5.scope.
Feb 16 13:55:52 compute-1 systemd[1]: Started libcrun container.
Feb 16 13:55:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00ede73563879764d667bf0d6e2b6d974cebf7fcd002c0f8ce96a688defd36a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 13:55:52 compute-1 podman[217321]: 2026-02-16 13:55:52.528741026 +0000 UTC m=+0.106991175 container init 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:55:52 compute-1 podman[217321]: 2026-02-16 13:55:52.532818876 +0000 UTC m=+0.111068975 container start 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 13:55:52 compute-1 podman[217321]: 2026-02-16 13:55:52.441252758 +0000 UTC m=+0.019502837 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 13:55:52 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [NOTICE]   (217341) : New worker (217343) forked
Feb 16 13:55:52 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [NOTICE]   (217341) : Loading success.
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.685 185914 DEBUG nova.compute.manager [req-bfc9c34a-4912-4ebd-aa45-1a1da234f26e req-cfc656fb-b9be-4b3b-9391-4d5b4be6f2f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.686 185914 DEBUG oslo_concurrency.lockutils [req-bfc9c34a-4912-4ebd-aa45-1a1da234f26e req-cfc656fb-b9be-4b3b-9391-4d5b4be6f2f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.686 185914 DEBUG oslo_concurrency.lockutils [req-bfc9c34a-4912-4ebd-aa45-1a1da234f26e req-cfc656fb-b9be-4b3b-9391-4d5b4be6f2f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.686 185914 DEBUG oslo_concurrency.lockutils [req-bfc9c34a-4912-4ebd-aa45-1a1da234f26e req-cfc656fb-b9be-4b3b-9391-4d5b4be6f2f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.687 185914 DEBUG nova.compute.manager [req-bfc9c34a-4912-4ebd-aa45-1a1da234f26e req-cfc656fb-b9be-4b3b-9391-4d5b4be6f2f8 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Processing event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.687 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.692 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771250152.691677, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.692 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Resumed (Lifecycle Event)
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.694 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.697 185914 INFO nova.virt.libvirt.driver [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance spawned successfully.
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.697 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.724 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.729 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.732 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.733 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.733 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.734 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.734 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.734 185914 DEBUG nova.virt.libvirt.driver [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.783 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.897 185914 INFO nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Took 8.71 seconds to spawn the instance on the hypervisor.
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.898 185914 DEBUG nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:55:52 compute-1 nova_compute[185910]: 2026-02-16 13:55:52.994 185914 INFO nova.compute.manager [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Took 9.31 seconds to build instance.
Feb 16 13:55:53 compute-1 nova_compute[185910]: 2026-02-16 13:55:53.037 185914 DEBUG oslo_concurrency.lockutils [None req-f5188114-998b-4742-87a2-cdc0c1603778 a2a37907788d4195986dc759905dcc95 77712f67f33f426cb3d6d9b7a640f32a - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.845 185914 DEBUG nova.compute.manager [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.846 185914 DEBUG oslo_concurrency.lockutils [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.846 185914 DEBUG oslo_concurrency.lockutils [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.846 185914 DEBUG oslo_concurrency.lockutils [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.846 185914 DEBUG nova.compute.manager [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.847 185914 WARNING nova.compute.manager [req-714c90c8-0399-4fcd-98dc-ecfa47d25850 req-1b143ef8-fb29-4250-a86e-615cc6afd3af faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state active and task_state None.
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.973 185914 DEBUG nova.network.neutron [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updated VIF entry in instance network info cache for port 8d907fd7-6b02-461e-8612-e5f777af8eea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:55:54 compute-1 nova_compute[185910]: 2026-02-16 13:55:54.974 185914 DEBUG nova.network.neutron [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:55:55 compute-1 nova_compute[185910]: 2026-02-16 13:55:55.010 185914 DEBUG oslo_concurrency.lockutils [req-d3120b0f-c3f1-416d-bb2b-f281dbf96060 req-63c42ff2-5b43-4aab-ac80-d3e41fd14b9c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:55:55 compute-1 nova_compute[185910]: 2026-02-16 13:55:55.522 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:55 compute-1 nova_compute[185910]: 2026-02-16 13:55:55.954 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:55:59 compute-1 podman[217352]: 2026-02-16 13:55:59.956620384 +0000 UTC m=+0.096412270 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:56:00 compute-1 nova_compute[185910]: 2026-02-16 13:56:00.569 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:00 compute-1 nova_compute[185910]: 2026-02-16 13:56:00.958 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:03.370 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:03.371 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:03.372 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:04 compute-1 ovn_controller[96285]: 2026-02-16T13:56:04Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:53:06 10.100.0.8
Feb 16 13:56:04 compute-1 ovn_controller[96285]: 2026-02-16T13:56:04Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:53:06 10.100.0.8
Feb 16 13:56:05 compute-1 nova_compute[185910]: 2026-02-16 13:56:05.573 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:05 compute-1 podman[195236]: time="2026-02-16T13:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:56:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:56:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 13:56:05 compute-1 nova_compute[185910]: 2026-02-16 13:56:05.959 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:06 compute-1 podman[217390]: 2026-02-16 13:56:06.911744568 +0000 UTC m=+0.048244762 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:56:10 compute-1 nova_compute[185910]: 2026-02-16 13:56:10.576 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:10 compute-1 nova_compute[185910]: 2026-02-16 13:56:10.962 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:15 compute-1 sshd-session[217416]: Invalid user postgres from 188.166.42.159 port 42078
Feb 16 13:56:15 compute-1 sshd-session[217416]: Connection closed by invalid user postgres 188.166.42.159 port 42078 [preauth]
Feb 16 13:56:15 compute-1 nova_compute[185910]: 2026-02-16 13:56:15.579 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:15 compute-1 nova_compute[185910]: 2026-02-16 13:56:15.964 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:16 compute-1 sshd-session[217418]: Invalid user test from 146.190.226.24 port 41160
Feb 16 13:56:16 compute-1 sshd-session[217418]: Connection closed by invalid user test 146.190.226.24 port 41160 [preauth]
Feb 16 13:56:19 compute-1 openstack_network_exporter[198096]: ERROR   13:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:56:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:56:19 compute-1 openstack_network_exporter[198096]: ERROR   13:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:56:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:56:20 compute-1 nova_compute[185910]: 2026-02-16 13:56:20.611 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:20 compute-1 podman[217421]: 2026-02-16 13:56:20.921189689 +0000 UTC m=+0.055579959 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 13:56:20 compute-1 podman[217420]: 2026-02-16 13:56:20.922435753 +0000 UTC m=+0.056659239 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 16 13:56:20 compute-1 nova_compute[185910]: 2026-02-16 13:56:20.967 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:22 compute-1 ovn_controller[96285]: 2026-02-16T13:56:22Z|00226|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 13:56:25 compute-1 nova_compute[185910]: 2026-02-16 13:56:25.614 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:25 compute-1 nova_compute[185910]: 2026-02-16 13:56:25.970 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.126 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Check if temp file /var/lib/nova/instances/tmphtyz6pdh exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.126 185914 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.837 185914 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.894 185914 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.895 185914 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 13:56:26 compute-1 nova_compute[185910]: 2026-02-16 13:56:26.948 185914 DEBUG oslo_concurrency.processutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 13:56:29 compute-1 sshd-session[217465]: Accepted publickey for nova from 192.168.122.100 port 53132 ssh2: ECDSA SHA256:U309eBAZgvPXicX2lI3ib2903RjOpPXbPKVddWOb314
Feb 16 13:56:29 compute-1 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 13:56:29 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 13:56:29 compute-1 systemd-logind[821]: New session 50 of user nova.
Feb 16 13:56:29 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 13:56:29 compute-1 systemd[1]: Starting User Manager for UID 42436...
Feb 16 13:56:29 compute-1 systemd[217469]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:56:29 compute-1 nova_compute[185910]: 2026-02-16 13:56:29.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:29 compute-1 systemd[217469]: Queued start job for default target Main User Target.
Feb 16 13:56:29 compute-1 systemd[217469]: Created slice User Application Slice.
Feb 16 13:56:29 compute-1 systemd[217469]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:56:29 compute-1 systemd[217469]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 13:56:29 compute-1 systemd[217469]: Reached target Paths.
Feb 16 13:56:29 compute-1 systemd[217469]: Reached target Timers.
Feb 16 13:56:29 compute-1 systemd[217469]: Starting D-Bus User Message Bus Socket...
Feb 16 13:56:29 compute-1 systemd[217469]: Starting Create User's Volatile Files and Directories...
Feb 16 13:56:29 compute-1 systemd[217469]: Finished Create User's Volatile Files and Directories.
Feb 16 13:56:29 compute-1 systemd[217469]: Listening on D-Bus User Message Bus Socket.
Feb 16 13:56:29 compute-1 systemd[217469]: Reached target Sockets.
Feb 16 13:56:29 compute-1 systemd[217469]: Reached target Basic System.
Feb 16 13:56:29 compute-1 systemd[217469]: Reached target Main User Target.
Feb 16 13:56:29 compute-1 systemd[217469]: Startup finished in 158ms.
Feb 16 13:56:29 compute-1 systemd[1]: Started User Manager for UID 42436.
Feb 16 13:56:29 compute-1 systemd[1]: Started Session 50 of User nova.
Feb 16 13:56:29 compute-1 sshd-session[217465]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 13:56:29 compute-1 sshd-session[217484]: Received disconnect from 192.168.122.100 port 53132:11: disconnected by user
Feb 16 13:56:29 compute-1 sshd-session[217484]: Disconnected from user nova 192.168.122.100 port 53132
Feb 16 13:56:29 compute-1 sshd-session[217465]: pam_unix(sshd:session): session closed for user nova
Feb 16 13:56:29 compute-1 systemd[1]: session-50.scope: Deactivated successfully.
Feb 16 13:56:29 compute-1 systemd-logind[821]: Session 50 logged out. Waiting for processes to exit.
Feb 16 13:56:29 compute-1 systemd-logind[821]: Removed session 50.
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.617 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:30 compute-1 podman[217486]: 2026-02-16 13:56:30.956310534 +0000 UTC m=+0.090282064 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.971 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.976 185914 DEBUG nova.compute.manager [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.976 185914 DEBUG oslo_concurrency.lockutils [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.977 185914 DEBUG oslo_concurrency.lockutils [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.977 185914 DEBUG oslo_concurrency.lockutils [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.977 185914 DEBUG nova.compute.manager [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:30 compute-1 nova_compute[185910]: 2026-02-16 13:56:30.977 185914 DEBUG nova.compute.manager [req-854acb4c-a8c1-4132-8ed6-e5f9b80383d9 req-97f365c1-15d7-4e65-a8e5-9a73a688479c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:56:31 compute-1 nova_compute[185910]: 2026-02-16 13:56:31.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.477 185914 INFO nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Took 5.53 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.478 185914 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.518 185914 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphtyz6pdh',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3266d7e2-8d63-44ff-970a-45b95f88dc2f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f030fcd6-f4e2-43f5-8bfe-798430c65047),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.556 185914 DEBUG nova.objects.instance [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lazy-loading 'migration_context' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.558 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.561 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.561 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.579 185914 DEBUG nova.virt.libvirt.vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:55:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2141702843-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:55:52Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.580 185914 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.581 185914 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.582 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 13:56:32 compute-1 nova_compute[185910]:   <mac address="fa:16:3e:d1:53:06"/>
Feb 16 13:56:32 compute-1 nova_compute[185910]:   <model type="virtio"/>
Feb 16 13:56:32 compute-1 nova_compute[185910]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 13:56:32 compute-1 nova_compute[185910]:   <mtu size="1442"/>
Feb 16 13:56:32 compute-1 nova_compute[185910]:   <target dev="tap8d907fd7-6b"/>
Feb 16 13:56:32 compute-1 nova_compute[185910]: </interface>
Feb 16 13:56:32 compute-1 nova_compute[185910]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.582 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 13:56:32 compute-1 nova_compute[185910]: 2026-02-16 13:56:32.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.064 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.065 185914 INFO nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.097 185914 DEBUG nova.compute.manager [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.098 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.098 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.099 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.099 185914 DEBUG nova.compute.manager [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.099 185914 WARNING nova.compute.manager [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state active and task_state migrating.
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.099 185914 DEBUG nova.compute.manager [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-changed-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.100 185914 DEBUG nova.compute.manager [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Refreshing instance network info cache due to event network-changed-8d907fd7-6b02-461e-8612-e5f777af8eea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.100 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.100 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.101 185914 DEBUG nova.network.neutron [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Refreshing network info cache for port 8d907fd7-6b02-461e-8612-e5f777af8eea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.167 185914 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.670 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:33 compute-1 nova_compute[185910]: 2026-02-16 13:56:33.671 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:34 compute-1 nova_compute[185910]: 2026-02-16 13:56:34.174 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:34 compute-1 nova_compute[185910]: 2026-02-16 13:56:34.175 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:34 compute-1 nova_compute[185910]: 2026-02-16 13:56:34.679 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:34 compute-1 nova_compute[185910]: 2026-02-16 13:56:34.680 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.184 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.185 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.619 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:35 compute-1 podman[195236]: time="2026-02-16T13:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:56:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17244 "" "Go-http-client/1.1"
Feb 16 13:56:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.689 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.690 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:35 compute-1 nova_compute[185910]: 2026-02-16 13:56:35.973 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.194 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.195 185914 DEBUG nova.virt.libvirt.migration [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.425 185914 DEBUG nova.virt.driver [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] Emitting event <LifecycleEvent: 1771250196.425119, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.426 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Paused (Lifecycle Event)
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.460 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.467 185914 DEBUG nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.501 185914 DEBUG nova.network.neutron [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updated VIF entry in instance network info cache for port 8d907fd7-6b02-461e-8612-e5f777af8eea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.502 185914 DEBUG nova.network.neutron [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.513 185914 INFO nova.compute.manager [None req-d8224c6f-6cf2-4bcf-bfb6-de0edac557e9 - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.540 185914 DEBUG oslo_concurrency.lockutils [req-99ed3b6d-e744-4356-aad7-2bd335727645 req-1743ba27-9802-4ea1-aada-c5bcad485ea4 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:56:36 compute-1 kernel: tap8d907fd7-6b (unregistering): left promiscuous mode
Feb 16 13:56:36 compute-1 NetworkManager[56388]: <info>  [1771250196.5705] device (tap8d907fd7-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 13:56:36 compute-1 ovn_controller[96285]: 2026-02-16T13:56:36Z|00227|binding|INFO|Releasing lport 8d907fd7-6b02-461e-8612-e5f777af8eea from this chassis (sb_readonly=0)
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.579 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 ovn_controller[96285]: 2026-02-16T13:56:36Z|00228|binding|INFO|Setting lport 8d907fd7-6b02-461e-8612-e5f777af8eea down in Southbound
Feb 16 13:56:36 compute-1 ovn_controller[96285]: 2026-02-16T13:56:36Z|00229|binding|INFO|Removing iface tap8d907fd7-6b ovn-installed in OVS
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.583 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.589 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.591 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:53:06 10.100.0.8'], port_security=['fa:16:3e:d1:53:06 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'b0e583b2-47d7-4bde-bbd6-282143e0c194'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3266d7e2-8d63-44ff-970a-45b95f88dc2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09e98704-cf1f-47d1-8021-93211c7aa37e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '77712f67f33f426cb3d6d9b7a640f32a', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c1368845-2f7a-494d-9bee-474d9166c8a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4e8c351-7159-44d3-b122-efa9b0154fd9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>], logical_port=8d907fd7-6b02-461e-8612-e5f777af8eea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4c895d6640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.592 105573 INFO neutron.agent.ovn.metadata.agent [-] Port 8d907fd7-6b02-461e-8612-e5f777af8eea in datapath 09e98704-cf1f-47d1-8021-93211c7aa37e unbound from our chassis
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.594 105573 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09e98704-cf1f-47d1-8021-93211c7aa37e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.595 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[bfa1e466-b16e-4e3b-a572-1ce6c5155aa8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.596 105573 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e namespace which is not needed anymore
Feb 16 13:56:36 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Feb 16 13:56:36 compute-1 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000001d.scope: Consumed 13.258s CPU time.
Feb 16 13:56:36 compute-1 systemd-machined[155419]: Machine qemu-19-instance-0000001d terminated.
Feb 16 13:56:36 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [NOTICE]   (217341) : haproxy version is 2.8.14-c23fe91
Feb 16 13:56:36 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [NOTICE]   (217341) : path to executable is /usr/sbin/haproxy
Feb 16 13:56:36 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [WARNING]  (217341) : Exiting Master process...
Feb 16 13:56:36 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [ALERT]    (217341) : Current worker (217343) exited with code 143 (Terminated)
Feb 16 13:56:36 compute-1 neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e[217337]: [WARNING]  (217341) : All workers exited. Exiting... (0)
Feb 16 13:56:36 compute-1 systemd[1]: libpod-6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5.scope: Deactivated successfully.
Feb 16 13:56:36 compute-1 podman[217547]: 2026-02-16 13:56:36.750372112 +0000 UTC m=+0.046494974 container died 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.761 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.764 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5-userdata-shm.mount: Deactivated successfully.
Feb 16 13:56:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-00ede73563879764d667bf0d6e2b6d974cebf7fcd002c0f8ce96a688defd36a2-merged.mount: Deactivated successfully.
Feb 16 13:56:36 compute-1 podman[217547]: 2026-02-16 13:56:36.796451124 +0000 UTC m=+0.092573986 container cleanup 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 16 13:56:36 compute-1 systemd[1]: libpod-conmon-6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5.scope: Deactivated successfully.
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.805 185914 DEBUG nova.virt.libvirt.guest [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.805 185914 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migration operation has completed
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.806 185914 INFO nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] _post_live_migration() is started..
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.812 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.812 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.812 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 13:56:36 compute-1 podman[217590]: 2026-02-16 13:56:36.861635081 +0000 UTC m=+0.044164871 container remove 6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.866 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[98748c9b-4bef-45c4-aaac-9698f5687edd]: (4, ('Mon Feb 16 01:56:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e (6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5)\n6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5\nMon Feb 16 01:56:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e (6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5)\n6abb3c9798ed092d66f6e71e83bbc881bdb55369457eca31f6f7b696ead231b5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.869 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[7da67888-7405-4dfa-80e7-513c6fb088bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.870 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09e98704-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.873 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 kernel: tap09e98704-c0: left promiscuous mode
Feb 16 13:56:36 compute-1 nova_compute[185910]: 2026-02-16 13:56:36.883 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.886 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[81260458-c586-4e63-a29c-d06f470bc741]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.904 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9f1d9e-645b-4962-869e-1608e09447f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.906 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[a198e0c1-49f5-4887-b978-baf553a260ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.923 206668 DEBUG oslo.privsep.daemon [-] privsep: reply[5698c966-2018-45cf-92cd-17cd9018cdae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623639, 'reachable_time': 23388, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217609, 'error': None, 'target': 'ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:36 compute-1 systemd[1]: run-netns-ovnmeta\x2d09e98704\x2dcf1f\x2d47d1\x2d8021\x2d93211c7aa37e.mount: Deactivated successfully.
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.928 106042 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09e98704-cf1f-47d1-8021-93211c7aa37e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 13:56:36 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:36.928 106042 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8d6688-8ffd-4c56-a6b0-8c6060a1b6cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 13:56:37 compute-1 podman[217610]: 2026-02-16 13:56:37.007486753 +0000 UTC m=+0.056070713 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.668 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.669 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.669 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.731 185914 DEBUG nova.compute.manager [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.731 185914 DEBUG oslo_concurrency.lockutils [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.731 185914 DEBUG oslo_concurrency.lockutils [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.732 185914 DEBUG oslo_concurrency.lockutils [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.732 185914 DEBUG nova.compute.manager [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.732 185914 DEBUG nova.compute.manager [req-da74b35b-e37c-4027-8929-024eb437a8a1 req-65a9da09-82ea-4305-892d-757fd9f1845c faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-unplugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.840 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.841 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5798MB free_disk=73.1943359375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.841 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.842 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:37 compute-1 nova_compute[185910]: 2026-02-16 13:56:37.906 185914 INFO nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating resource usage from migration f030fcd6-f4e2-43f5-8bfe-798430c65047
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.006 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Migration f030fcd6-f4e2-43f5-8bfe-798430c65047 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.006 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.007 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.029 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing inventories for resource provider 63898862-3dd6-49b3-9545-63882243296a _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.055 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating ProviderTree inventory for provider 63898862-3dd6-49b3-9545-63882243296a from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.056 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Updating inventory in ProviderTree for provider 63898862-3dd6-49b3-9545-63882243296a with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.072 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing aggregate associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.097 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Refreshing trait associations for resource provider 63898862-3dd6-49b3-9545-63882243296a, traits: COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.153 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.169 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.195 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:56:38 compute-1 nova_compute[185910]: 2026-02-16 13:56:38.196 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.190 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:39.285 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:56:39 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:39.286 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.310 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.324 185914 DEBUG nova.network.neutron [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Activated binding for port 8d907fd7-6b02-461e-8612-e5f777af8eea and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.325 185914 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.326 185914 DEBUG nova.virt.libvirt.vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T13:55:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1379037604',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1379037604',id=29,image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T13:55:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='77712f67f33f426cb3d6d9b7a640f32a',ramdisk_id='',reservation_id='r-p6jzrmkx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6fb9af7f-2971-4890-a777-6e99e888717f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-2141702843',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-2141702843-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T13:56:22Z,user_data=None,user_id='a2a37907788d4195986dc759905dcc95',uuid=3266d7e2-8d63-44ff-970a-45b95f88dc2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.327 185914 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converting VIF {"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.328 185914 DEBUG nova.network.os_vif_util [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.329 185914 DEBUG os_vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.332 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.333 185914 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d907fd7-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.336 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.338 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.341 185914 INFO os_vif [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:53:06,bridge_name='br-int',has_traffic_filtering=True,id=8d907fd7-6b02-461e-8612-e5f777af8eea,network=Network(09e98704-cf1f-47d1-8021-93211c7aa37e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d907fd7-6b')
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.342 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.343 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.343 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.343 185914 DEBUG nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.344 185914 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Deleting instance files /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f_del
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.345 185914 INFO nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Deletion of /var/lib/nova/instances/3266d7e2-8d63-44ff-970a-45b95f88dc2f_del complete
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.897 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.898 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.899 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.899 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.899 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.899 185914 WARNING nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state active and task_state migrating.
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.900 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.900 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.900 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.900 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.901 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.901 185914 WARNING nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state active and task_state migrating.
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.901 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.902 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.902 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.902 185914 DEBUG oslo_concurrency.lockutils [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.902 185914 DEBUG nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] No waiting events found dispatching network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 13:56:39 compute-1 nova_compute[185910]: 2026-02-16 13:56:39.903 185914 WARNING nova.compute.manager [req-6327a4c3-e17b-40ef-9b8b-65cb22d916e7 req-04efd9e3-dfc3-48cd-ae3f-862289201691 faac0d122d2542618616237dd42c4a59 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Received unexpected event network-vif-plugged-8d907fd7-6b02-461e-8612-e5f777af8eea for instance with vm_state active and task_state migrating.
Feb 16 13:56:40 compute-1 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 13:56:40 compute-1 systemd[217469]: Activating special unit Exit the Session...
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped target Main User Target.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped target Basic System.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped target Paths.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped target Sockets.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped target Timers.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 13:56:40 compute-1 systemd[217469]: Closed D-Bus User Message Bus Socket.
Feb 16 13:56:40 compute-1 systemd[217469]: Stopped Create User's Volatile Files and Directories.
Feb 16 13:56:40 compute-1 systemd[217469]: Removed slice User Application Slice.
Feb 16 13:56:40 compute-1 systemd[217469]: Reached target Shutdown.
Feb 16 13:56:40 compute-1 systemd[217469]: Finished Exit the Session.
Feb 16 13:56:40 compute-1 systemd[217469]: Reached target Exit the Session.
Feb 16 13:56:40 compute-1 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 13:56:40 compute-1 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 13:56:40 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 13:56:40 compute-1 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 13:56:40 compute-1 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 13:56:40 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 13:56:40 compute-1 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 13:56:40 compute-1 nova_compute[185910]: 2026-02-16 13:56:40.976 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.631 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.663 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquired lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.664 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 13:56:42 compute-1 nova_compute[185910]: 2026-02-16 13:56:42.664 185914 DEBUG nova.objects.instance [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3266d7e2-8d63-44ff-970a-45b95f88dc2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 13:56:44 compute-1 nova_compute[185910]: 2026-02-16 13:56:44.336 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:45 compute-1 nova_compute[185910]: 2026-02-16 13:56:45.977 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:47 compute-1 nova_compute[185910]: 2026-02-16 13:56:47.554 185914 DEBUG nova.network.neutron [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updating instance_info_cache with network_info: [{"id": "8d907fd7-6b02-461e-8612-e5f777af8eea", "address": "fa:16:3e:d1:53:06", "network": {"id": "09e98704-cf1f-47d1-8021-93211c7aa37e", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1331653241-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "77712f67f33f426cb3d6d9b7a640f32a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d907fd7-6b", "ovs_interfaceid": "8d907fd7-6b02-461e-8612-e5f777af8eea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 13:56:47 compute-1 nova_compute[185910]: 2026-02-16 13:56:47.583 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Releasing lock "refresh_cache-3266d7e2-8d63-44ff-970a-45b95f88dc2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 13:56:47 compute-1 nova_compute[185910]: 2026-02-16 13:56:47.584 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 13:56:47 compute-1 nova_compute[185910]: 2026-02-16 13:56:47.585 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:56:47 compute-1 nova_compute[185910]: 2026-02-16 13:56:47.585 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:56:48 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:56:48.288 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.525 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.526 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.526 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "3266d7e2-8d63-44ff-970a-45b95f88dc2f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.555 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.556 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.556 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.557 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.699 185914 WARNING nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.700 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5811MB free_disk=73.22299575805664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.700 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.700 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.795 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration for instance 3266d7e2-8d63-44ff-970a-45b95f88dc2f refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.870 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.922 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Migration f030fcd6-f4e2-43f5-8bfe-798430c65047 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.923 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.923 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:56:48 compute-1 nova_compute[185910]: 2026-02-16 13:56:48.975 185914 DEBUG nova.compute.provider_tree [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.027 185914 DEBUG nova.scheduler.client.report [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.092 185914 DEBUG nova.compute.resource_tracker [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.092 185914 DEBUG oslo_concurrency.lockutils [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.097 185914 INFO nova.compute.manager [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Migrating instance to compute-0.ctlplane.example.com finished successfully.
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.323 185914 INFO nova.scheduler.client.report [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] Deleted allocation for migration f030fcd6-f4e2-43f5-8bfe-798430c65047
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.323 185914 DEBUG nova.virt.libvirt.driver [None req-e6b0b2de-d7dc-4ef7-9379-840c95d8b8ab bcb37b9a6e1349fa8fa1c516fd117b36 3c2003c34c504dc8b9400b60c520789d - - default default] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 13:56:49 compute-1 nova_compute[185910]: 2026-02-16 13:56:49.340 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:49 compute-1 openstack_network_exporter[198096]: ERROR   13:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:56:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:56:49 compute-1 openstack_network_exporter[198096]: ERROR   13:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:56:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:56:50 compute-1 nova_compute[185910]: 2026-02-16 13:56:50.979 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:51 compute-1 nova_compute[185910]: 2026-02-16 13:56:51.805 185914 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771250196.804458, 3266d7e2-8d63-44ff-970a-45b95f88dc2f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 13:56:51 compute-1 nova_compute[185910]: 2026-02-16 13:56:51.806 185914 INFO nova.compute.manager [-] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] VM Stopped (Lifecycle Event)
Feb 16 13:56:51 compute-1 podman[217638]: 2026-02-16 13:56:51.923281745 +0000 UTC m=+0.067140551 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=)
Feb 16 13:56:51 compute-1 nova_compute[185910]: 2026-02-16 13:56:51.931 185914 DEBUG nova.compute.manager [None req-88df94fc-8cea-4936-9e60-83b4374e2a1d - - - - - -] [instance: 3266d7e2-8d63-44ff-970a-45b95f88dc2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 13:56:51 compute-1 podman[217639]: 2026-02-16 13:56:51.934851587 +0000 UTC m=+0.078121047 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 16 13:56:54 compute-1 nova_compute[185910]: 2026-02-16 13:56:54.342 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:55 compute-1 nova_compute[185910]: 2026-02-16 13:56:55.982 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:56:59 compute-1 nova_compute[185910]: 2026-02-16 13:56:59.346 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:00 compute-1 nova_compute[185910]: 2026-02-16 13:57:00.985 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:01 compute-1 podman[217676]: 2026-02-16 13:57:01.085216832 +0000 UTC m=+0.071012635 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 16 13:57:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:03.372 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:03.372 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:03.373 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:04 compute-1 nova_compute[185910]: 2026-02-16 13:57:04.367 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:05 compute-1 podman[195236]: time="2026-02-16T13:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:57:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:57:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 13:57:05 compute-1 nova_compute[185910]: 2026-02-16 13:57:05.987 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:06 compute-1 sshd-session[217704]: Invalid user postgres from 188.166.42.159 port 52620
Feb 16 13:57:06 compute-1 sshd-session[217704]: Connection closed by invalid user postgres 188.166.42.159 port 52620 [preauth]
Feb 16 13:57:07 compute-1 podman[217706]: 2026-02-16 13:57:07.937125523 +0000 UTC m=+0.073453731 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 13:57:09 compute-1 nova_compute[185910]: 2026-02-16 13:57:09.370 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:10 compute-1 nova_compute[185910]: 2026-02-16 13:57:10.989 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:14 compute-1 nova_compute[185910]: 2026-02-16 13:57:14.373 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:15 compute-1 nova_compute[185910]: 2026-02-16 13:57:15.991 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:19 compute-1 nova_compute[185910]: 2026-02-16 13:57:19.377 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:19 compute-1 openstack_network_exporter[198096]: ERROR   13:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:57:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:57:19 compute-1 openstack_network_exporter[198096]: ERROR   13:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:57:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:57:19 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:19.772 105573 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '52:96:1b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '16:6c:7a:bd:49:39'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 13:57:19 compute-1 nova_compute[185910]: 2026-02-16 13:57:19.773 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:19 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:19.773 105573 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 13:57:19 compute-1 nova_compute[185910]: 2026-02-16 13:57:19.939 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:20 compute-1 nova_compute[185910]: 2026-02-16 13:57:20.992 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:22 compute-1 podman[217732]: 2026-02-16 13:57:22.907146017 +0000 UTC m=+0.044969273 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 13:57:22 compute-1 podman[217731]: 2026-02-16 13:57:22.937849445 +0000 UTC m=+0.078646771 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, 
build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Feb 16 13:57:23 compute-1 sshd-session[217768]: Invalid user oracle from 146.190.226.24 port 48524
Feb 16 13:57:23 compute-1 sshd-session[217768]: Connection closed by invalid user oracle 146.190.226.24 port 48524 [preauth]
Feb 16 13:57:24 compute-1 nova_compute[185910]: 2026-02-16 13:57:24.380 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:25 compute-1 nova_compute[185910]: 2026-02-16 13:57:25.994 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:26 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:57:26.775 105573 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=54c1a259-778a-4222-b2c6-8422ea19a065, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 13:57:29 compute-1 nova_compute[185910]: 2026-02-16 13:57:29.383 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:29 compute-1 nova_compute[185910]: 2026-02-16 13:57:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:30 compute-1 nova_compute[185910]: 2026-02-16 13:57:30.994 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:31 compute-1 nova_compute[185910]: 2026-02-16 13:57:31.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:31 compute-1 podman[217770]: 2026-02-16 13:57:31.938486556 +0000 UTC m=+0.082276717 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 13:57:32 compute-1 nova_compute[185910]: 2026-02-16 13:57:32.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:33 compute-1 nova_compute[185910]: 2026-02-16 13:57:33.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:34 compute-1 nova_compute[185910]: 2026-02-16 13:57:34.385 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:34 compute-1 nova_compute[185910]: 2026-02-16 13:57:34.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:35 compute-1 podman[195236]: time="2026-02-16T13:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:57:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:57:35 compute-1 podman[195236]: @ - - [16/Feb/2026:13:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 13:57:36 compute-1 nova_compute[185910]: 2026-02-16 13:57:36.007 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:38 compute-1 podman[217796]: 2026-02-16 13:57:38.925965171 +0000 UTC m=+0.065002933 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.387 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.630 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.658 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.658 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.659 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.659 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.842 185914 WARNING nova.virt.libvirt.driver [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.844 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5819MB free_disk=73.22296905517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.845 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.845 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.934 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.935 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.962 185914 DEBUG nova.compute.provider_tree [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed in ProviderTree for provider: 63898862-3dd6-49b3-9545-63882243296a update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.981 185914 DEBUG nova.scheduler.client.report [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Inventory has not changed for provider 63898862-3dd6-49b3-9545-63882243296a based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.983 185914 DEBUG nova.compute.resource_tracker [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 13:57:39 compute-1 nova_compute[185910]: 2026-02-16 13:57:39.983 185914 DEBUG oslo_concurrency.lockutils [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:57:40 compute-1 nova_compute[185910]: 2026-02-16 13:57:40.978 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:41 compute-1 nova_compute[185910]: 2026-02-16 13:57:41.009 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:42 compute-1 nova_compute[185910]: 2026-02-16 13:57:42.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:42 compute-1 nova_compute[185910]: 2026-02-16 13:57:42.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 13:57:42 compute-1 nova_compute[185910]: 2026-02-16 13:57:42.632 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 13:57:42 compute-1 nova_compute[185910]: 2026-02-16 13:57:42.645 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 13:57:44 compute-1 nova_compute[185910]: 2026-02-16 13:57:44.389 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:46 compute-1 nova_compute[185910]: 2026-02-16 13:57:46.011 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:46 compute-1 nova_compute[185910]: 2026-02-16 13:57:46.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:46 compute-1 nova_compute[185910]: 2026-02-16 13:57:46.659 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:57:46 compute-1 nova_compute[185910]: 2026-02-16 13:57:46.660 185914 DEBUG nova.compute.manager [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 13:57:49 compute-1 nova_compute[185910]: 2026-02-16 13:57:49.392 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:49 compute-1 openstack_network_exporter[198096]: ERROR   13:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:57:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:57:49 compute-1 openstack_network_exporter[198096]: ERROR   13:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:57:49 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:57:50 compute-1 sshd-session[217820]: Invalid user ubuntu from 2.57.122.210 port 54214
Feb 16 13:57:50 compute-1 sshd-session[217820]: Connection closed by invalid user ubuntu 2.57.122.210 port 54214 [preauth]
Feb 16 13:57:51 compute-1 nova_compute[185910]: 2026-02-16 13:57:51.013 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:53 compute-1 podman[217822]: 2026-02-16 13:57:53.922879089 +0000 UTC m=+0.055932261 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, 
com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 13:57:53 compute-1 podman[217823]: 2026-02-16 13:57:53.927760642 +0000 UTC m=+0.057806383 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 16 13:57:54 compute-1 nova_compute[185910]: 2026-02-16 13:57:54.395 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:55 compute-1 ovn_controller[96285]: 2026-02-16T13:57:55Z|00230|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Feb 16 13:57:56 compute-1 nova_compute[185910]: 2026-02-16 13:57:56.015 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:57:58 compute-1 sshd-session[217864]: Invalid user postgres from 188.166.42.159 port 55398
Feb 16 13:57:58 compute-1 sshd-session[217864]: Connection closed by invalid user postgres 188.166.42.159 port 55398 [preauth]
Feb 16 13:57:59 compute-1 nova_compute[185910]: 2026-02-16 13:57:59.397 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:01 compute-1 nova_compute[185910]: 2026-02-16 13:58:01.019 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:02 compute-1 podman[217866]: 2026-02-16 13:58:02.965310212 +0000 UTC m=+0.109766725 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 16 13:58:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:58:03.374 105573 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 13:58:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:58:03.375 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 13:58:03 compute-1 ovn_metadata_agent[105568]: 2026-02-16 13:58:03.375 105573 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 13:58:04 compute-1 nova_compute[185910]: 2026-02-16 13:58:04.407 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:05 compute-1 podman[195236]: time="2026-02-16T13:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 13:58:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16011 "" "Go-http-client/1.1"
Feb 16 13:58:05 compute-1 podman[195236]: @ - - [16/Feb/2026:13:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 13:58:06 compute-1 nova_compute[185910]: 2026-02-16 13:58:06.024 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:09 compute-1 nova_compute[185910]: 2026-02-16 13:58:09.414 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:09 compute-1 podman[217893]: 2026-02-16 13:58:09.92882173 +0000 UTC m=+0.072958063 container health_status c45861c2eb0fc262f5655d024356b9cde515106880236f6c8b34d55c3b113a41 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 13:58:11 compute-1 nova_compute[185910]: 2026-02-16 13:58:11.026 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:14 compute-1 nova_compute[185910]: 2026-02-16 13:58:14.417 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:16 compute-1 nova_compute[185910]: 2026-02-16 13:58:16.029 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:19 compute-1 openstack_network_exporter[198096]: ERROR   13:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 13:58:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:58:19 compute-1 openstack_network_exporter[198096]: ERROR   13:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 13:58:19 compute-1 openstack_network_exporter[198096]: 
Feb 16 13:58:19 compute-1 nova_compute[185910]: 2026-02-16 13:58:19.419 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:21 compute-1 nova_compute[185910]: 2026-02-16 13:58:21.030 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:22 compute-1 sshd-session[217918]: Accepted publickey for zuul from 192.168.122.10 port 54130 ssh2: ECDSA SHA256:6P1Q60WP6ePVjzkJMgpM7PslBYS6YIK3pVS2L1FZ4Yk
Feb 16 13:58:22 compute-1 systemd-logind[821]: New session 52 of user zuul.
Feb 16 13:58:22 compute-1 systemd[1]: Started Session 52 of User zuul.
Feb 16 13:58:22 compute-1 sshd-session[217918]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 13:58:22 compute-1 sudo[217922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 16 13:58:22 compute-1 sudo[217922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 13:58:24 compute-1 nova_compute[185910]: 2026-02-16 13:58:24.421 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:24 compute-1 podman[218061]: 2026-02-16 13:58:24.731986012 +0000 UTC m=+0.056319242 container health_status 6a6deaf2cd1214aac742e70f94fb9eca8558ddcba836ac6055ae52735f888879 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 13:58:24 compute-1 podman[218057]: 2026-02-16 13:58:24.738337964 +0000 UTC m=+0.062690304 container health_status 63264a63c0295826a748a5f4cfce71fbf54dc091f97d6dc23b8572add1093935 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-f37eba2e1f8dd4be2b97d68dab0d9888fc4029bed8a8c37d466d49cf8acd1e6a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Feb 16 13:58:26 compute-1 nova_compute[185910]: 2026-02-16 13:58:26.032 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:26 compute-1 ovs-vsctl[218124]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 16 13:58:27 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 217946 (sos)
Feb 16 13:58:27 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 16 13:58:27 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 16 13:58:27 compute-1 virtqemud[185025]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 16 13:58:27 compute-1 virtqemud[185025]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 16 13:58:27 compute-1 virtqemud[185025]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 16 13:58:28 compute-1 crontab[218537]: (root) LIST (root)
Feb 16 13:58:29 compute-1 nova_compute[185910]: 2026-02-16 13:58:29.424 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:29 compute-1 nova_compute[185910]: 2026-02-16 13:58:29.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:30 compute-1 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 16 13:58:30 compute-1 systemd[1]: Starting Hostname Service...
Feb 16 13:58:30 compute-1 systemd[1]: Started Hostname Service.
Feb 16 13:58:31 compute-1 sshd-session[218616]: Invalid user oracle from 146.190.226.24 port 38596
Feb 16 13:58:31 compute-1 nova_compute[185910]: 2026-02-16 13:58:31.037 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:31 compute-1 sshd-session[218616]: Connection closed by invalid user oracle 146.190.226.24 port 38596 [preauth]
Feb 16 13:58:33 compute-1 nova_compute[185910]: 2026-02-16 13:58:33.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:33 compute-1 nova_compute[185910]: 2026-02-16 13:58:33.632 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 13:58:33 compute-1 podman[218974]: 2026-02-16 13:58:33.938399389 +0000 UTC m=+0.078796882 container health_status 6ce1dfc1a1494fff750397a8656d21681dbf370755d74db419e2c5b342399ac1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '2559ee7e54d1fadf763e4f0700344e5d93945155415e606383ad0b050e358c51-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6-fd177e8978f24ea94c71d1d4c6b38347018b96e95bb4fea6e4c42643ae98deb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 13:58:34 compute-1 nova_compute[185910]: 2026-02-16 13:58:34.427 185914 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 13:58:34 compute-1 nova_compute[185910]: 2026-02-16 13:58:34.631 185914 DEBUG oslo_service.periodic_task [None req-ea1e350b-3634-4cb7-add9-7ce1267c0e4d - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
